Thursday, October 4, 2018

Digest for comp.lang.c++@googlegroups.com - 25 updates in 3 topics

James Kuyper <jameskuyper@alumni.caltech.edu>: Oct 03 07:45PM -0400

On 10/03/2018 06:06 PM, Paavo Helde wrote:
...
 
> char x = 65;
> printf("%c", x);
> printf("%d", x);
 
Keep in mind that you can do exactly the same thing in C++, using the
exact same code. It would be more appropriate to compare <cstdio> and
<iostream>.
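
A comparison along those lines might look something like this (a quick
sketch of my own, not from the original post):

#include <cstdio>
#include <iostream>

int main() {
    char x = 65;
    std::printf("%c %d\n", x, x);                           // A 65
    std::cout << x << ' ' << static_cast<int>(x) << '\n';   // A 65
}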
Paavo Helde <myfirstname@osa.pri.ee>: Oct 04 08:47AM +0300

On 4.10.2018 2:45, James Kuyper wrote:
 
> Keep in mind that you can do exactly the same thing in C++, using the
> exact same code. It would be more appropriate to compare <cstdio> and
> <iostream>.
 
printf() is not typesafe so cannot be really advocated for C++ usage.
What one can do is to create a typesafe wrapper around printf, I'm using
a home-grown one which allows me to write e.g.
 
char x = 65;
std::cout << Sprintf("%c %d\n")(x)(x);
 
OUTPUT: A 65
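
A minimal sketch of what such a wrapper could look like (this is not the
actual home-grown implementation; the Sprintf class below, its run-time
checking scheme and the lack of %%/width handling are assumptions for
illustration only):

#include <iostream>
#include <sstream>
#include <stdexcept>
#include <string>

class Sprintf {
public:
    explicit Sprintf(std::string fmt) : fmt_(std::move(fmt)) {}

    // Each call formats the next "%..." specifier with a value whose type
    // is known to the compiler, so the conversion can never be mismatched.
    template <typename T>
    Sprintf& operator()(const T& value) {
        std::string::size_type pos = fmt_.find('%', pos_);
        if (pos == std::string::npos)
            throw std::runtime_error("too many arguments for format string");
        out_ += fmt_.substr(pos_, pos - pos_);
        char spec = fmt_.at(pos + 1);
        std::ostringstream os;
        if (spec == 'c')
            os << static_cast<char>(value);   // %c always prints a character
        else if (spec == 'd' || spec == 'u')
            os << +value;                     // unary + promotes char-like types
        else
            os << value;                      // fall back to the stream inserter
        out_ += os.str();
        pos_ = pos + 2;
        return *this;
    }

    friend std::ostream& operator<<(std::ostream& os, const Sprintf& s) {
        return os << s.out_ << s.fmt_.substr(s.pos_);
    }

private:
    std::string fmt_;
    std::string out_;
    std::string::size_type pos_ = 0;
};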
 
There are other typesafe formatting libraries, but nothing in the
standard AFAIK. Regrettably I see that Boost.Format gets the %d+char
output wrong, at least in the version I have.
 
There is a standards proposal
http://open-std.org/JTC1/SC22/WG21/docs/papers/2013/n3716.html
which says for %d "as if it is formatted by snprintf" so hopefully it
would get it right.
Ralf Goertz <me@myprovider.invalid>: Oct 04 10:57AM +0200

Am Wed, 03 Oct 2018 12:48:11 -0700
> To me that's the bottom line: a name should say what it means,
> and not more, and not less. The types [u]int8_t hardly ever do
> that.
 
+1
David Brown <david.brown@hesbynett.no>: Oct 04 01:47PM +0200

On 03/10/18 21:48, Tim Rentsch wrote:
> particular using 'char' for holding an integer value is nutty
> (not counting cases where the values come from character
> constants or things like that).
 
I agree entirely.
 
> question are meant only to hold small integer values, to avoid
> confusion with other uses of character types, and unsigned char in
> particular.
 
Agreed.
 
> complement representation, or not? Similarly, for 'uint8_t u;',
> is it important that its partner type use two's complement? In
> most cases I suspect it isn't.
 
I agree that in many cases, the fact that a type is two's complement is
not important. But barring /really/ odd machines - so odd that for
almost everyone, the possibility can be ignored - your integer types
/will/ be two's complement. And though the C standards say "int8_t"
must be two's complement, the name itself does not. I think generally
the range of the type - knowing it is from -128 to 127 - is more
important than the representation.
 
To me, "int8_t" says "8-bit signed integer". It is as simple as that.
So when I want an 8-bit signed integer, "int8_t" is as good a name as
you can get.
 
It might be that people want just a "small integer", in which case 8-bit
might be more specific than they need. And for that purpose,
int_least8_t is not a typename that rolls off the tongue.
 
> whereas others (which IIUC includes you) treat [u]int8_t as
> completely separate from the "character" aspects of the [un]signed
> char types.
 
I think it is a disadvantage that the types are - in all but
hypothetical implementations - synonyms for character types. It mixes
different things. I'd prefer a clearer separation between types for
holding characters (which might have different sizes for different
encoding systems, but for which "signed" and "unsigned" make no sense to
me), types for dealing with raw memory (bypassing the "strict aliasing"
rules, and preferably available in different sizes), and integer types.
 
> want to use uint8_t, because it has other connotational baggage
> that I really don't want to convey. Similarly for a small and
> signed type.
 
To me, the only "baggage" is that the size is fixed at exactly 8 bits. But I
usually see that as an advantage - I like to know exactly what ranges I
have and what space is taken up. That may be because of the kind of
programming I do, and may not be the same for other people.
 
> To me that's the bottom line: a name should say what it means,
> and not more, and not less. The types [u]int8_t hardly ever do
> that.
 
It seems that these type names say something a little different to you
and to me, and that we have slightly different needs from them. So I am
happy with the names int8_t and uint8_t, but I can understand your
points for why you don't find them ideal.
 
However, you haven't addressed the elephant in the room here - what
would /you/ suggest as names for types here, and what characteristics
would you prefer them to have?
David Brown <david.brown@hesbynett.no>: Oct 04 01:54PM +0200

On 04/10/18 07:47, Paavo Helde wrote:
>> exact same code. It would be more appropriate to compare <cstdio> and
>> <iostream>.
 
> printf() is not typesafe so cannot be really advocated for C++ usage.
 
Of course printf is fine to use in C++. It has exactly the same
advantages and disadvantages as it has in C. std::cout has many good
features in C++, but it also has at least three glaring problems in
comparison to printf - the complexity of translating strings, the
statefulness of things like outputting hex format, and the one you have
mentioned here.
 
 
For some compilers, as long as the printf format string is known at
compile time, the compiler can check the number and types of the
arguments. It is not as good as being type-safe, but it goes a long way
towards avoiding bugs.
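
As a small illustration of both points (my own example; the attribute is
the GCC/Clang extension, so assume one of those compilers, and
log_message is just a hypothetical user function):

#include <cstdio>
#include <iostream>

// GCC and Clang can check printf-style arguments of user functions as
// well, via this (non-standard) attribute.
__attribute__((format(printf, 1, 2)))
void log_message(const char* fmt, ...);

int main() {
    int n = 255;

    // iostream formatting state is "sticky": std::hex stays in effect.
    std::cout << std::hex << n << '\n';   // ff
    std::cout << n << '\n';               // still ff
    std::cout << std::dec << n << '\n';   // 255 again

    // printf formatting is per conversion, with no hidden state.
    std::printf("%x %d\n", n, n);         // ff 255
}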
 
 
Paavo Helde <myfirstname@osa.pri.ee>: Oct 04 03:14PM +0300

On 3.10.2018 22:48, Tim Rentsch wrote:
> particular using 'char' for holding an integer value is nutty
> (not counting cases where the values come from character
> constants or things like that).
 
Ouch, that hurts! I have recently spent a lot of time on converting and
storing numeric arrays in the smallest possible datatype, including
8-bit, in order to reduce the memory footprint and enhance the
performance with large arrays. It's sad to hear I'm not sensible ;-)
 
On a related note, lots of image formats support 8-bit pixels, both in
grayscale and as components of RGB colors. Whenever you see yet another
cat picture on your screen, a heavy amount of 8-bit data has been
processed numerically to put it there.
scott@slp53.sl.home (Scott Lurndal): Oct 04 12:46PM

>> exact same code. It would be more appropriate to compare <cstdio> and
>> <iostream>.
 
>printf() is not typesafe so cannot be really advocated for C++ usage.
 
That's a fallacy. The former doesn't imply the latter.
David Brown <david.brown@hesbynett.no>: Oct 04 03:15PM +0200

On 04/10/18 14:14, Paavo Helde wrote:
> grayscale and as components of RGB colors. Whenever you see yet another
> cat picture on your screen, a heavy amount of 8-bit data has been
> processed numerically to put it there.
 
There is nothing "nutty" about using small numerical data - but there
/is/ something nutty about using types called "char", "signed char" or
"unsigned char" for it. (This is my own opinion - I am not trying to
speak for Tim. But it looks like we agree here.) Use types "int8_t" or
"uint8_t" instead - or, if you prefer, given them a different typedef'ed
name that matches the usage.
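
For example (the alias names here are just made-up suggestions, not
established conventions):

#include <cstdint>

using pixel8  = std::uint8_t;   // one 8-bit image channel
using sample8 = std::int8_t;    // a small signed numeric sample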
Paavo Helde <myfirstname@osa.pri.ee>: Oct 04 04:27PM +0300

On 4.10.2018 14:54, David Brown wrote:
> On 04/10/18 07:47, Paavo Helde wrote:
 
>> printf() is not typesafe so cannot be really advocated for C++ usage.
 
> Of course printf is fine to use in C++.
 
Yes, I agree it's fine to use. I just said it cannot be advocated, and
trying to do that e.g. in this group would bring a huge backlash, I'm sure.
 
I have tried to avoid printf() myself, as I sometimes like to refactor
code massively, which is dangerous in the presence of printf and the
like. However, I have now checked and found out that my primary compiler
detects printf format string mismatches and can also turn the warnings
into errors via '#pragma warning (error: 4477)', so maybe I should
rethink my position.
Paavo Helde <myfirstname@osa.pri.ee>: Oct 04 04:59PM +0300

On 4.10.2018 16:15, David Brown wrote:
> speak for Tim. But it looks like we agree here.) Use types "int8_t" or
> "uint8_t" instead - or, if you prefer, given them a different typedef'ed
> name that matches the usage.
 
Yes, maybe I misunderstood Tim and he just talked about using the "char"
*name*. Agreed this should not be used for non-strings. However, the
problem is that std::int8_t and std::uint8_t are really just typedefs,
not real types, which makes them merely cosmetic and makes it impossible
to fix the std stream behavior, for example.
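
For example (a small demonstration of the typedef problem, on a typical
implementation where std::uint8_t is an alias for unsigned char):

#include <cstdint>
#include <iostream>

int main() {
    std::uint8_t v = 65;
    std::cout << v << '\n';                         // prints "A", not "65"
    std::cout << static_cast<unsigned>(v) << '\n';  // prints "65"
}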
 
Anyway, in my code the 8-bit types typically appear in this context as T
in template<typename T>, so the naming issue is moot.
James Kuyper <jameskuyper@alumni.caltech.edu>: Oct 04 10:07AM -0400

On 10/04/2018 08:14 AM, Paavo Helde wrote:
> storing numeric arrays in the smallest possible datatype, including
> 8-bit, in order to reduce the memory footprint and enhance the
> performance with large arrays. It's sad to hear I'm not sensible ;-)
 
Using the smallest possible datatype to store numeric arrays in is
perfectly sensible - but there are many different names that refer to
types which have that same minimum size: char, signed char, unsigned
char, int_least8_t, uint_least8_t, and if supported, int8_t and uint8_t.
Of all of those different names, "char" is by far the least appropriate
one to use for that purpose, because it's the only one where you cannot
be portably certain whether it's a signed or unsigned type. When using
them to store numbers, that's a critically important thing you need to
know. That's what's not sensible about your approach.
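
For example (a tiny demonstration of the signedness problem; the exact
result with plain char is implementation-defined):

#include <iostream>

int main() {
    char c = 200;   // may become -56 if char is signed, stays 200 if unsigned
    std::cout << static_cast<int>(c) << '\n';   // -56 or 200, depending on platform
}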
James Kuyper <jameskuyper@alumni.caltech.edu>: Oct 04 10:11AM -0400

On 10/04/2018 01:47 AM, Paavo Helde wrote:
 
> printf() is not typesafe so cannot be really advocated for C++ usage.
> What one can do is to create a typesafe wrapper around printf, I'm using
> a home-grown one which allows me to write e.g.
 
I was responding to a claim that, for this purpose, C is better than
C++. That judgement was made despite the fact that C's printf() is not
typesafe. Is the C++ version of printf() any less typesafe than the C one?
James Kuyper <jameskuyper@alumni.caltech.edu>: Oct 04 10:15AM -0400

On 10/04/2018 09:27 AM, Paavo Helde wrote:
 
>>> printf() is not typesafe so cannot be really advocated for C++ usage.
 
>> Of course printf is fine to use in C++.
 
> Yes, I agree it's fine to use. I just said it cannot be advocated,
 
It can't? I don't see what's preventing it from being advocated. The
advantage you originally described is perfectly real. It's not
necessarily important enough to justify using printf(), but I wouldn't
see anything wrong with someone deciding otherwise and expressing that
opinion here.
Paavo Helde <myfirstname@osa.pri.ee>: Oct 04 05:45PM +0300

On 4.10.2018 17:11, James Kuyper wrote:
 
> I was responding to a claim that, for this purpose, C is better than
> C++. That judgement was made despite the fact that C's printf() is not
> typesafe. Is the C++ version of printf() any less typesafe than the C one?
 
One can argue that it is, because the expectations about the language
and compiler are different. For example, if I am changing a type I
expect the C++ compiler to diagnose all the places in the code which are
now broken by the change. In C I know I cannot really expect that, so
making changes is a much more careful and tedious business.
Chris Vine <chris@cvine--nospam--.freeserve.co.uk>: Oct 04 05:15PM +0100

On Thu, 04 Oct 2018 16:59:57 +0300
> problem is that std::int8_t and std::uint8_t are really just typedefs,
> not real types, which makes them merely cosmetic and makes it impossible
> to fix the std stream behavior, for example.
 
I don't seem to feel the angst others do with how the standard streams
print the value of the fixed or minimum sized integer types provided by
stdint.h. If your integer type is uint8_t and you want it to print out
with std::printf as an integer and not a character, you use the "%d" or
"%u" format specifier (it won't make any difference which) and not the
"%c" format specifier. If you want to print it out with std::ostream's
operator << as an integer and not a character, you cast it to int or
unsigned int when invoking operator <<.[1] It seems somewhat extreme
to break congruity with C and make all the fixed and minimum width
types in stdint.h their own distinct types just to please operator <<
and >>.
 
Likewise, if you want to print out the address held in a pointer, you
cast to void*. I don't see the problem. After all, by allowing casts
(for good reason) C++ is not soundly typed to begin with.
 
Chris
 
[1] In theory on some weird platform uint8_t might not be a typedef to
unsigned char, so maybe if you want to print it as a character you
should cast it to char although that seems somewhat pedantic.
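
A short sketch of those casts (my own example, assuming the usual case
where uint8_t is a typedef for unsigned char):

#include <cstdint>
#include <cstdio>
#include <iostream>

int main() {
    std::uint8_t u = 200;
    std::printf("%u\n", u);          // u is promoted to int; prints 200
    std::cout << int(u) << '\n';     // cast for operator <<; prints 200

    const char* s = "hello";
    std::cout << s << '\n';                             // prints the string
    std::cout << static_cast<const void*>(s) << '\n';   // prints the address
}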
jameskuyper@alumni.caltech.edu: Oct 04 09:48AM -0700

On Tuesday, October 2, 2018 at 11:47:22 AM UTC-4, Ralf Goertz wrote:
> }
 
> If uint8_t were not typedefed to unsigned char but a separate type then
> it would not even compile?
 
Correct. I was talking about the single object inserters and extractors.
Since values of type uint8_t get promoted to 'int', the inserter for int
should work with uint8_t objects. However, the extractor for int uses an
int&, and therefore won't work for uint8_t, and the same is true for
both the inserters and extractors that work with pointers to character
types, since integer promotions don't help with references and pointers.
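
For example (an illustration of the extractor point, again assuming the
usual typedef to unsigned char):

#include <cstdint>
#include <iostream>
#include <sstream>

int main() {
    std::istringstream in1("123");
    std::uint8_t v = 0;
    in1 >> v;                                   // unsigned char& extractor: reads only '1'
    std::cout << static_cast<int>(v) << '\n';   // prints 49, the character code of '1'

    std::istringstream in2("123");
    int n = 0;
    in2 >> n;                  // int& extractor parses the whole number
    std::cout << n << '\n';    // prints 123
}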
Ian Collins <ian-news@hotmail.com>: Oct 04 08:01PM +1300

On 04/10/18 10:32, Rick C. Hodgin wrote:
 
>> There you go again, can't face the truth and fall back on "sin"...  Read what
>> Schweitzer has to say, her core faith is undamaged by the truth.
 
> I am telling you the truth, Ian.
 
So, did you read it?
 
> https://www.youtube.com/watch?v=00vBqYDBW5s&t=35m23s
 
> It is absolutely impossible for all life on Earth to have
> evolved from the Big Bang + aftermath.
 
There you go, jumping to the erroneous conclusion that the evolution of
DNA is only determined by probability. News flash! It isn't.
 
--
Ian
gazelle@shell.xmission.com (Kenny McCormack): Oct 04 11:42AM

In article <tlbtD.12016$BM.9907@fx09.am4>,
 
>> So criticizing you is wrong?
 
>Same old. Rick is insane, no point in discussing anything with insane
>person.
 
Agreed. I don't even bother engaging directly anymore. It used to be fun,
but no more. But it can still be fun to talk about/around him.
 
For us dedicated Rick watchers, the question boils down to "Insane or Tool?"
In fact, given how nuts the news is - and getting nuttier every day - the
question of "Insane or Tool?" applies to a lot of people nowadays.
 
And note that a subclass of Tool is "Tool pretending to be insane".
 
And, hey, that last comment almost makes this on-topic for a C++ group.
 
--
"Unattended children will be given an espresso and a free kitten."
Dan Purgert <dan@djph.net>: Oct 04 11:53AM

Kenny McCormack wrote:
> [...]
> For us dedicated Rick watchers, the question boils down to "Insane or
> Tool?"
 
I was unaware they were mutually-exclusive. Guess I missed an update to
the standards :)
 
 
--
|_|O|_| Registered Linux user #585947
|_|_|O| Github: https://github.com/dpurgert
|O|O|O| PGP: 05CA 9A50 3F2E 1335 4DC5 4AEE 8E11 DDF3 1279 A281
David Brown <david.brown@hesbynett.no>: Oct 04 02:07PM +0200

On 03/10/18 23:24, Rick C. Hodgin wrote:
>>>> and when? That is defamation.
 
>>> Multiple times. You have criticized me and called me names
>>> falsely. Most everyone here has done so.
 
Here's a clue - when almost everybody says you are wrong, then perhaps
it is because you /are/ wrong.
 
>> pretend something that they are not.
 
> It's the content and the way you criticize me, which is because I
> am teaching you about the author of the universe.
 
No, people criticise you when you write nonsense, in an inappropriate
way to people who are not interested, and because you refuse to give
honest and genuine answers when people ask questions. They criticise
your dishonesty, your hypocrisy, your hatred, your judgemental and
holier-than-thou attitude, your narcissism, your self-proclaimed
martyrdom, your arrogance.
 
No one is criticising God - since so few here believe there is such a
thing, the idea of criticising, insulting or rejecting god makes no
sense. I don't "reject god" any more than you reject the flying
spaghetti monster or pink unicorns. It is all personal - it is the
things /you/ say that people reject, criticise and complain about. And
it's time you took some responsibility for your actions and words, and
realised it is not Christianity, or religion, or God that people are
rejecting, it is /your/ posts here. Nothing more, nothing less - your
posts, your words, your fault.
gazelle@shell.xmission.com (Kenny McCormack): Oct 04 12:25PM

In article <slrnprbvq4.6s3.dan@xps-linux.djph.net>,
>> Tool?"
 
>I was unaware they were mutually-exclusive. Guess I missed an update to
>the standards :)
 
I get what you're saying, but, yes, in my terminology, they are mutually
exclusive.
 
By "Tool", I mean someone whose weird behavior is not explainable as
insanity. I.e., in the criminal justice system, insanity removes guilt.
If someone is judged insane, they are considered not guilty of whatever
whacko thing they did. In my terminology, to be judged a Tool, you have to
be guilty - in this case, guilty of trolling Usenet - and, as we've seen,
insane people are not guilty.
 
Personally, my best guess is that Rick is *not* insane, but he does a very
good job of playing so on Usenet. This makes him a Tool.
 
--
"Only a genius could lose a billion dollars running a casino."
"You know what they say: the house always loses."
"When life gives you lemons, don't pay taxes."
"Grab 'em by the p***y!"
Dan Purgert <dan@djph.net>: Oct 04 01:03PM

Kenny McCormack wrote:
 
> I get what you're saying, but, yes, in my terminology, they are mutually
> exclusive.
> [...]
 
Works for me.
 
 
--
|_|O|_| Registered Linux user #585947
|_|_|O| Github: https://github.com/dpurgert
|O|O|O| PGP: 05CA 9A50 3F2E 1335 4DC5 4AEE 8E11 DDF3 1279 A281
"Rick C. Hodgin" <rick.c.hodgin@gmail.com>: Oct 04 10:53AM -0400

On 10/4/2018 3:01 AM, Ian Collins wrote:
> There you go, jumping to the erroneous conclusion that the evolution of DNA
> is only determined by probability.  News flash! It isn't.
 
I did not jump to that conclusion. I know the answer, Ian. :-)
 
It's the total blindness sin pulls over people, Ian. Until
they come to Jesus and have their sin taken away, that blindness
will keep them shackled to falseness, unable to get up and move
to the truth.
 
--
Rick C. Hodgin
Melzzzzz <Melzzzzz@zzzzz.com>: Oct 04 03:13PM


> Agreed. I don't even bother engaging directly anymore. It used to be fun,
> but no more. But it can still be fun to talk about/around him.
 
> For us dedicated Rick watchers, the question boils down to "Insane or Tool?"
 
He obviously cracked sometime in the 2000s, when God revealed to him the
ultimate truth. Loonies endlessly repeat the same thing without reason
and disregard any argument. That's the hallmark of a loony ;)
 
 
--
press any key to continue or any other to quit...
David Brown <david.brown@hesbynett.no>: Oct 04 01:58PM +0200

On 03/10/18 22:55, Chris M. Thomasson wrote:
 
> I inappropriately used the term race-condition to refer to the actual
> busted fetch-and-add implementation used to generate "random" numbers.
> Sorry about that.
 
And I inappropriately took the name you used at face value and thought
you meant a data race as defined in the standards - so you have my
apologies for that.
 
With that sorted out, I see no undefined behaviour and I agree that your
program demonstrates why synchronisation primitives such as fetch-and-add
have to be atomic!
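
A minimal sketch of the idea (my own example, not Chris's program):

#include <atomic>
#include <iostream>
#include <thread>

int main() {
    std::atomic<int> counter{0};

    auto work = [&counter] {
        for (int i = 0; i < 100000; ++i) {
            // One indivisible read-modify-write.  A plain "++counter" on a
            // non-atomic int here would be a data race and could lose updates.
            counter.fetch_add(1, std::memory_order_relaxed);
        }
    };

    std::thread t1(work), t2(work);
    t1.join();
    t2.join();

    std::cout << counter << '\n';   // always 200000
}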
You received this digest because you're subscribed to updates for this group. You can change your settings on the group membership page.
To unsubscribe from this group and stop receiving emails from it send an email to comp.lang.c+++unsubscribe@googlegroups.com.
