- Comparison operator used in STL/deque - 10 Updates
Bonita Montero <Bonita.Montero@gmail.com>: Apr 06 06:06PM +0200

> On Thu, 6 Apr 2023 17:42:21 +0200
>> Use strA.c_str().
> Bit of an ugly hack really. Shouldn't be necessary.

Then you have a lot of work if you do it your way. I prefer this easier
way, which I don't consider ugly.
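[A minimal sketch of the c_str() approach under discussion, assuming POSIX strcasecmp() is available (Windows spells it _stricmp()); the comparator name is illustrative, not from the thread, and this only handles single-byte encodings and stops at embedded '\0' characters:]
__________________
// Sketch: ordering a deque of std::string case-insensitively by
// handing c_str() to a C-style comparison. Assumes POSIX strcasecmp();
// an embedded '\0' truncates the comparison, which is part of why it
// was called a hack above.
#include <strings.h>    // strcasecmp (POSIX)
#include <algorithm>
#include <deque>
#include <iostream>
#include <string>

struct CaseInsensitiveLess {
    bool operator()(const std::string& a, const std::string& b) const {
        return strcasecmp(a.c_str(), b.c_str()) < 0;
    }
};

int main() {
    std::deque<std::string> d{"Banana", "apple", "Cherry"};
    std::sort(d.begin(), d.end(), CaseInsensitiveLess{});
    for (const auto& s : d)
        std::cout << s << '\n';   // apple Banana Cherry
}
__________________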
Bonita Montero <Bonita.Montero@gmail.com>: Apr 06 06:07PM +0200

> I don't know how compare() works in this case, but presumably "count" is the
> shortest length of the two strings? ...

That's what I also assumed. The caller is responsible for deciding what
to do if both strings are equal up to count.
Daniel <danielaparker@gmail.com>: Apr 06 09:27AM -0700

>> size_t count )
> I don't know how compare() works in this case, but presumably "count" is the
> shortest length of the two strings?

Yes. See https://en.cppreference.com/w/cpp/string/basic_string/compare
for an explanation of how std::basic_string::compare uses
std::char_traits::compare. If std::char_traits::compare returns 0,
std::basic_string::compare returns 0 if the two strings are the same
size, < 0 if the first string is shorter, and > 0 if the second string
is shorter.

Daniel
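[A short sketch of the tie-breaking Daniel describes; compare() only guarantees the sign of the result, so the checks below test the sign rather than a specific value:]
__________________
// When the common prefix compares equal, the result is decided by
// length: equal sizes give 0, shorter-first gives a negative value.
#include <iostream>
#include <string>

int main() {
    std::string a = "abc", b = "abcde";
    std::cout << (a.compare(b) < 0) << '\n';       // 1: "abc" is shorter
    std::cout << (b.compare(a) > 0) << '\n';       // 1: "abcde" is longer
    std::cout << a.compare(0, 3, b, 0, 3) << '\n'; // 0: equal up to count
}
__________________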
Bo Persson <bo@bo-persson.se>: Apr 06 07:23PM +0200

>> In 2023, there would be little point in having a case insensitive compare
>> function where only ASCII letters could be compared in a case-insensitive way
> Even ascii only would be better than nothing.

Who says std::string uses ASCII?
scott@slp53.sl.home (Scott Lurndal): Apr 06 05:36PM

>>> function where only ASCII letters could be compared in a case-insensitive way
>> Even ascii only would be better than nothing.
> Who says std::string uses ASCII?

Given ASCII is a proper subset of UTF-8.... Granted there are degenerate
operating systems that still use 16-bit wide characters, but that's an
aberration :-).
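[A quick sketch of Scott's point: ASCII occupies 0x00-0x7F, which is exactly the single-byte range of UTF-8, so pure-ASCII text is already valid UTF-8 with no re-encoding. The helper name is hypothetical:]
__________________
// Every ASCII byte is below 0x80; UTF-8 encodes exactly those code
// points as single identical bytes, so an all-ASCII string is UTF-8.
#include <iostream>
#include <string>

bool is_ascii(const std::string& s) {   // hypothetical helper
    for (unsigned char c : s)
        if (c > 0x7F) return false;
    return true;
}

int main() {
    std::cout << std::boolalpha
              << is_ascii("hello") << ' '          // true: UTF-8 as-is
              << is_ascii("h\xC3\xA9llo") << '\n'; // false: multi-byte UTF-8
}
__________________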
Daniel <danielaparker@gmail.com>: Apr 06 12:21PM -0700

On Thursday, April 6, 2023 at 1:36:58 PM UTC-4, Scott Lurndal wrote:
> >> Even ascii only would be better than nothing.
> > Who says std::string uses ASCII?
> Given ASCII is a proper subset of UTF-8....

True, but I thought Bo's point was that since the bytes in a
std::string could be in any 8-bit encoding (ASCII, UTF-8, ISO 8859-1,
GB2312, EBCDIC), it would be problematic to add functions to
std::string that assume a particular encoding (such as ASCII).

> Granted there are degenerate operating systems that still use 16-bit
> wide characters, but that's an aberration :-).

What do you mean by "character"? The term "character" isn't defined by
Unicode. In a modern language, I think a char or character would most
sensibly be defined as a Unicode Scalar Value (as it is in Rust).

Daniel
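[For reference, a Unicode Scalar Value is any code point in [U+0000, U+10FFFF] excluding the UTF-16 surrogate range [U+D800, U+DFFF]; a minimal sketch with a hypothetical helper name:]
__________________
// A char32_t holds a Unicode Scalar Value iff it is in range and is
// not a surrogate; this is the definition Rust's char type uses.
#include <iostream>

bool is_unicode_scalar_value(char32_t cp) {   // hypothetical helper
    return cp <= 0x10FFFF && !(cp >= 0xD800 && cp <= 0xDFFF);
}

int main() {
    std::cout << std::boolalpha
              << is_unicode_scalar_value(U'A') << ' '     // true
              << is_unicode_scalar_value(0xD800) << '\n'; // false: surrogate
}
__________________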
Keith Thompson <Keith.S.Thompson+u@gmail.com>: Apr 06 12:44PM -0700

>> >On 2023-04-06 at 16:13, Mut...@dastardlyhq.com wrote:
>> >> On Thu, 6 Apr 2023 07:07:55 -0700 (PDT)
>> >> Daniel <daniel...@gmail.com> wrote:
[...]
> any 8-bit encoding, could be ASCII, UTF-8, ISO 8859-1, GB2312, EBCDIC,
> it would be problematic to add functions to std::string that assume a particular
> encoding (such as ASCII.)

Who says a byte is 8 bits?

[...]

--
Keith Thompson (The_Other_Keith) Keith.S.Thompson+u@gmail.com
Working, but not speaking, for XCOM Labs
void Void(void) { Void(); } /* The recursive call of the void */
Daniel <danielaparker@gmail.com>: Apr 06 01:12PM -0700

On Thursday, April 6, 2023 at 3:44:34 PM UTC-4, Keith Thompson wrote:
> > it would be problematic to add functions to std::string that assume a particular
> > encoding (such as ASCII.)
> Who says a byte is 8 bits?

Well, nobody said that :-) The reference to the number of bits referred
to the encodings. Admittedly, ASCII is a 7-bit encoding. As long as a
C++ byte is at least 8 bits, it can hold an encoding that requires 7 or
8 bits.

Daniel
"Chris M. Thomasson" <chris.m.thomasson.1@gmail.com>: Apr 06 02:37PM -0700 On 4/6/2023 12:44 PM, Keith Thompson wrote: >> encoding (such as ASCII.) > Who says a byte is 8 bits? > [...] This does: A bit of sarcasm. __________________ #include <iostream> #include <cstdint> #include <climits> typedef std::uint8_t byte; #if ((UINT8_MAX != 255UL) || (CHAR_BIT != 8)) # error foobar