Wednesday, November 24, 2021

Digest for comp.lang.c++@googlegroups.com - 25 updates in 5 topics

Juha Nieminen <nospam@thanks.invalid>: Nov 24 08:06AM

> lower_bound and upper_bound need < and > comparison.
 
I'm pretty certain they only require that the elements are comparable
with std::less() (which means that operator<() is enough).
Bonita Montero <Bonita.Montero@gmail.com>: Nov 24 06:18PM +0100

Am 24.11.2021 um 09:06 schrieb Juha Nieminen:
>> lower_bound and upper_bound need < and > comparison.
 
> I'm pretty certain they only require that the elements are comparable
> with std::less() (which means that operator<() is enough).
 
I used the predicate version because I don't want to overload
less myself, which is more effort than a lambda. But you
can use lower_bound and upper_bound with asymmetric predicates.
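
For illustration, a minimal sketch of such asymmetric predicates (the
Entry type here is made up): no operator< for the element type is
needed, because each lambda compares an element against a bare key.
Note that the standard requires opposite argument orders for the two
calls.

    #include <algorithm>
    #include <iostream>
    #include <string>
    #include <vector>

    struct Entry { int key; std::string value; };

    int main()
    {
        // Must already be sorted by key for lower/upper_bound to work.
        std::vector<Entry> v{ {1, "a"}, {3, "b"}, {3, "c"}, {7, "d"} };

        // lower_bound calls comp(element, key):
        auto lo = std::lower_bound(v.begin(), v.end(), 3,
            [](const Entry &e, int k) { return e.key < k; });
        // upper_bound calls comp(key, element):
        auto hi = std::upper_bound(v.begin(), v.end(), 3,
            [](int k, const Entry &e) { return k < e.key; });

        std::cout << (hi - lo) << " entries with key 3\n";  // prints 2
    }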
Lynn McGuire <lynnmcguire5@gmail.com>: Nov 23 07:00PM -0600

On 11/23/2021 1:21 AM, Juha Nieminen wrote:
> actually causes inefficiencies in modern CPU architectures (because
> cache misses are very expensive).
 
> Oh well. As long as it works, who cares? Buy a faster computer.
 
Not in California.

https://nichegamer.com/high-end-gaming-pcs-banned-in-six-us-states-after-california-energy-bill-limits-sales-on-high-performance-pcs/
 
Lynn
Lynn McGuire <lynnmcguire5@gmail.com>: Nov 23 07:01PM -0600

On 11/23/2021 6:17 AM, Otto J. Makela wrote:
> longer done using C or C++, and the programs that are run (and still
> being developed) were originally created in the era when CPU power was
> much lower than these days, so code optimization was more important.
 
I write C++ and Fortran code just about every day for our software products.
 
Lynn
Richard Damon <Richard@Damon-Family.org>: Nov 23 09:27PM -0500

On 11/23/21 1:53 PM, Bonita Montero wrote:
>> if it is reuses that data instead of making a copy, at least until in
>> wants to change it. ...
 
> Then use string_view.
 
Doesn't work.
 
I have a lot of long-lived objects that store a string with a 'name' of
the object.
 
Most are created with a fixed compile-time name that I pass as a const
char* pointing to a read-only literal.
 
A few cases need to dynamically create a name based on specific usage,
so these need that name stored as a dynamic value, but that memory wants
to be reclaimed if/when the long-lived object goes away.
 
string_view doesn't help here, as when I create the few dynamic names,
there is no place to keep the char array that holds the name.
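
For illustration only, one possible shape for such a name holder (the
Name class here is hypothetical, and assumes C++17): hold either a
non-owning pointer to a string literal or an owning std::string, so
only the few dynamic names allocate, and their storage is reclaimed
together with the object.

    #include <string>
    #include <string_view>
    #include <utility>
    #include <variant>

    class Name {
        // Either a pointer to static storage or an owned buffer.
        std::variant<const char*, std::string> n_;
    public:
        // Literal / static-lifetime names: no allocation, no copy.
        Name(const char *literal) : n_(literal) {}
        // Dynamically built names: owned, freed with the object.
        Name(std::string dynamic) : n_(std::move(dynamic)) {}

        std::string_view view() const {
            if (auto p = std::get_if<const char*>(&n_))
                return *p;
            return std::get<std::string>(n_);
        }
    };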
om@iki.fi (Otto J. Makela): Nov 24 11:33AM +0200

>> optimization was more important.
 
> I write C++ and Fortran code just about every day for our software
> products.
 
Do you consider this to be a common situation?
--
/* * * Otto J. Makela <om@iki.fi> * * * * * * * * * */
/* Phone: +358 40 765 5772, ICBM: N 60 10' E 24 55' */
/* Mail: Mechelininkatu 26 B 27, FI-00100 Helsinki */
/* * * Computers Rule 01001111 01001011 * * * * * * */
om@iki.fi (Otto J. Makela): Nov 24 11:40AM +0200

> farms are taking up a siginificant percentage of the planet's
> electrical output its beholder on programmers to make their code
> as efficient as is reasonable.
 
I keep rereading that first sentence, is the meaning reversed?
 
--
/* * * Otto J. Makela <om@iki.fi> * * * * * * * * * */
/* Phone: +358 40 765 5772, ICBM: N 60 10' E 24 55' */
/* Mail: Mechelininkatu 26 B 27, FI-00100 Helsinki */
/* * * Computers Rule 01001111 01001011 * * * * * * */
Philipp Klaus Krause <pkk@spth.de>: Nov 24 11:43AM +0100

Am 23.11.21 um 00:25 schrieb Richard Damon:
> some measures.
 
> Fast code will be more energy efficient as processors basically use
> power based on the number of instructions executed (and memory accessed).
 
The paper mentions that this is a common assumption, but shows it is
not universally true.
 
Looking at table 4, we see that C is in the top spot both when it comes
to fast code and energy efficient code. And many other languages also
rank similarly in energy consumption as they do in speed. But there are
exceptions. E.g.:
 
Go is fast (only 183% slower than C), but energy inefficient (223% more
energy consumed than C).
 
Lisp is slow (240% slower than C) but energy efficient (127% more energy
consumed than C).
Juha Nieminen <nospam@thanks.invalid>: Nov 24 10:54AM


> Humm... I was just thinking along the lines of 'what does the C code
> actually do?' If it ends up blasting the system, how green is it? Well,
> yeah, that is a different point. You are right Juha.
 
I am assuming that the article considers cases where the CPU is at
full load throughout, with the same program implemented in different
programming languages. The faster the program finishes, the less
overall energy it will require. If a C program does the job in
1 second and a PHP program does it in 30 seconds, it's obvious that
the C program is going to consume less energy for the same task.
 
So it's not a question of "how many % of the CPU can I use with this
programming language", but "how long do I need to use the CPU in
order to perform this task".
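
To put rough numbers on that (assuming, purely for illustration, a
constant 100 W draw under full load):

    E = P * t
    C program:   100 W *  1 s =  100 J
    PHP program: 100 W * 30 s = 3000 J

Thirty times the runtime at the same power means thirty times the
energy for the same task.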
Juha Nieminen <nospam@thanks.invalid>: Nov 24 10:55AM

> Le 23/11/2021 à 08:21, Juha Nieminen a écrit :
>> Oh well. As long as it works, who cares? Buy a faster computer.
 
> This sentence shows that you quite obviously got what "green" means.
 
I have no idea what you are talking about.
 
Maybe you missed the fact that that sentence you quoted is sarcastic?
Philipp Klaus Krause <pkk@spth.de>: Nov 24 11:56AM +0100

Am 23.11.21 um 16:03 schrieb Manfred:
>> energy consump- tion
 
> I find the correlation slower/faster language respectively less/more
> energy quite confusing. In fact I believe it is the opposite.
 
Well, they do state in the paper that the common assumption is that
faster languages are more energy efficient. And their data supports
that this assumption actually holds for many languages.
 
But the surprising, and thus interesting result from their paper is that
this is not universally true. You can see one example when looking at
the data for Go and Lisp in the paper:
 
Go is fast (only 183% slower than C), but energy inefficient (223% more
energy consumed than C).
 
Lisp is slow (240% slower than C) but energy efficient (127% more energy
consumed than C).
Philipp Klaus Krause <pkk@spth.de>: Nov 24 12:01PM +0100

Am 24.11.21 um 11:54 schrieb Juha Nieminen:
 
> So it's not a question of "how many % of the CPU can I use with this
> programming language", but "how long do I need to use the CPU in
> order to perform this task".
 
On the other hand, the energy consumption of that CPU during that time
also depends on what it is doing (and to some degree on what data it is
processing). People have been doing side-channel attacks on cryptography
by very accurately measuring the energy consumption of the computers
doing the encrypting or decrypting.
 
Long ago, for fun, I did measure the energy consumption of a Z80 (in a
ColecoVision) with reasonably high time resolution (current probe on
oscilloscope). From the trace of the energy consumption, I could see
which instructions were executed.
David Brown <david.brown@hesbynett.no>: Nov 24 01:58PM +0100

On 24/11/2021 12:01, Philipp Klaus Krause wrote:
> ColecoVision) with reasonably high time resolution (current probe on
> oscilloscope). From the trace of the energy consumption, I could see
> which instructions were executed.
 
I used to debug my assembly code on my ZX Spectrum (with a Z80 cpu) by
the sound of the power supply.
 
However, with modern cpus there is so much going on that you are
unlikely to get much detail by tracking power consumption. You could
perhaps distinguish when blocks such as SIMD or the FPU are active, but
otherwise there are so many instructions in flight at a time that you
could not identify them. Differential power analysis for cryptanalysis
is almost entirely a thing of the past. (It could be practical for
simpler processors and dedicated cryptography devices, except that
designers of these things are usually aware of such attacks and they are
easily blocked.)
"Öö Tiib" <ootiib@hot.ee>: Nov 24 05:54AM -0800

On Wednesday, 24 November 2021 at 12:55:19 UTC+2, Juha Nieminen wrote:
 
> > This sentence shows that you quite obviously got what "green" means.
 
> I have no idea what you are talking about.
 
> Maybe you missed the fact that that sentence you quoted is sarcastic?
 
He didn't; he just wanted to troll. Not worth replying to.
Bart <bc@freeuk.com>: Nov 24 01:59PM

On 23/11/2021 22:46, David Brown wrote:
 
> Yes. But that is because I understand the difference between compiling
> code and /running/ compiled code. As long as your compilation process
> is not so slow that it hinders the development process,
 
I find compilers such as gcc slow enough that they would hinder /me/.
I can build my 3 main language tools, about 120Kloc and 100 files, in
1/4 second, the same time it takes gcc to build hello.c, about 4 lines
in one file.
 
> want as efficient results as you can get from the source code). At no
> point is the effort or energy required by the compiler at all relevant
> in the total sum.
 
It was an example of the sort of program that I can make run
efficiently by making some extra effort, and also by keeping things
small; it actually was not written in C.
 
It isn't about language, except that, everything else being equal, a C
implementation of a computationally intensive application like mine is
going to be faster than an equivalent CPython version, and most likely
still faster than PyPy.
 
So if gcc were implemented in CPython, compiling hello.c might take 10
seconds instead of 0.25 seconds, but tcc does the job in 0.03 seconds.
 
Consider that on my machine, even an empty program takes 0.02 seconds to
run...
"Öö Tiib" <ootiib@hot.ee>: Nov 24 06:03AM -0800

On Wednesday, 24 November 2021 at 12:56:28 UTC+2, Philipp Klaus Krause wrote:
> energy consumed than C).
 
> Lisp is slow (240% slower than C) but energy efficient (127% more energy
> consumed than C).
 
That is probably because of different concurrency support. A language a
dozen years old (like Go) can be expected to have better-designed
concurrency support than a language five dozen years old (like Lisp).
Manfred <noname@add.invalid>: Nov 24 03:30PM +0100

On 11/23/2021 11:35 PM, David Brown wrote:
> On 23/11/2021 20:59, Manfred wrote:
[...]
 
> Absolutely. I remember reading that the first Itanium processors had
> higher power densities than the core of a nuclear reactor - these
> devices handle a lot of power in a small space, and it all ends up as heat.
 
Speaking of anecdotes:
I remember during my university days a professor showing the difference
in power management for different technologies; something like a 1MB RAM
chip built on TTL would have to dissipate kilowatts of power....
 
 
Bart <bc@freeuk.com>: Nov 24 02:43PM

On 24/11/2021 14:30, Manfred wrote:
> I remember during my university days a professor showing the difference
> in power management for different technologies; something like a 1MB RAM
> chip built on TTL would have to dissipate kilowatts of power....
 
Well, using the 74LS189 device, which contains 64 bits of RAM, you
would have more practical problems first, such as needing some 131,000
of the things to make 1 MB, plus all the address-decoding circuitry.
 
It might be quite a few kilowatts you'd need.
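
The rough arithmetic (the per-chip figure is an assumption based on a
typical LS-TTL supply current, very roughly 80 mA at 5 V):

    1 MB = 8,388,608 bits; 8,388,608 / 64 bits per chip = 131,072 chips
    131,072 chips * ~0.4 W per chip = ~52 kW

So "quite a few kilowatts" indeed, before even counting the decoding
logic.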
DozingDog@thekennel.co: Nov 24 03:48PM

On Tue, 23 Nov 2021 12:28:12 -0500
>> there's probably not much if any extra effort in using C over C++.
 
>I would disagree. With a decent compiler, C code can generate close to
>assembly level optimizations for most problems. (Maybe it doesn't have
 
So can a C++ compiler, and with modern additions such as constexpr, C++
can optimise in ways that C simply can't.
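
For illustration, a minimal sketch of the kind of thing constexpr buys
(the fib function is just an example): the computation is guaranteed to
happen at compile time, so the emitted code contains only the final
constant. C has no general equivalent; even C23's constexpr covers
objects only, not functions.

    #include <cstdio>

    // Evaluated entirely by the compiler when used in a constant
    // expression; no fib() calls exist in the generated code.
    constexpr unsigned long fib(unsigned n)
    {
        return n < 2 ? n : fib(n - 1) + fib(n - 2);
    }

    int main()
    {
        constexpr unsigned long f20 = fib(20);  // compile-time value
        static_assert(f20 == 6765, "computed during compilation");
        std::printf("%lu\n", f20);
    }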
 
>ANYTHING in terms of data-structures that another language can generate,
>you can generate in C.
 
True, but you have to re-invent the wheel each time because the level
of complex data-structure support in the C standard library is frankly
woeful. E.g. the hsearch() and bsearch() functionality is rubbish (only
one tree/hash per process!) and Berkeley DB is a PITA to use.
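
For contrast, a trivial sketch of the C++ side of that comparison: the
standard library hands out any number of independent hash tables, each
with its own lifetime, whereas hsearch() manages a single process-wide
table (unless you resort to the nonstandard hcreate_r() extension).

    #include <iostream>
    #include <string>
    #include <unordered_map>

    int main()
    {
        // Two independent tables in one process.
        std::unordered_map<std::string, int> ages{{"alice", 30}};
        std::unordered_map<std::string, std::string> emails;

        emails["alice"] = "alice@example.com";

        auto it = ages.find("alice");
        if (it != ages.end())
            std::cout << it->first << " is " << it->second << '\n';
    }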
 
>is exactly why you can do things possibly more efficiently then the
>compiler. You could have always generated the same algorithm that the
>compiler did.
 
I doubt I could, for example, write an RB-tree system better than the
one implemented in the C++ STL. It has 30 years of refinement behind it.
 
>Now, if you want to talk of the efficiency of WRITING the code, (as
>opposed to executing it) all this power it gives you is a negative,
>which seems to be what you are talking about.
 
Indeed, but your argument that C always produces more efficient code than C++
probably hasn't been true for 20 years.
DozingDog@thekennel.co: Nov 24 03:49PM

On Wed, 24 Nov 2021 11:40:54 +0200
>> electrical output its beholder on programmers to make their code
>> as efficient as is reasonable.
 
>I keep rereading that first sentence, is the meaning reversed?
 
No.
scott@slp53.sl.home (Scott Lurndal): Nov 24 04:04PM

>> cache misses are very expensive).
 
>> Oh well. As long as it works, who cares? Buy a faster computer.
 
>Not in California.
 
And Hawaii, Colorado, Oregon, Vermont and Washington State.
 
Why did you single out California?
 
And one can certainly purchase the most powerful computers in all of those
states by assembling them personally - note also that servers aren't covered
by the regulation, only desktop systems.
 
Did you actually read the article you posted a link to?
 
Here's an article without the panic, and with actual, you know, facts.
 
https://www.makeuseof.com/why-the-california-ban-on-power-hogging-pcs-is-a-good-thing/
Richard Damon <Richard@Damon-Family.org>: Nov 24 11:54AM -0500

> of complex data-structure support in the C standard library is frankly
> woeful. E.g. the hsearch() and bsearch() functionality is rubbish (only
> one tree/hash per process!) and Berkeley DB is a PITA to use.
 
Right, I never said it was the BEST way, especially if you include
programmer productivity.
 
If your goal is ABSOLUTELY TOP CPU EFFICIENCY, then 'standard libraries'
become just optional (use only if it IS the best for YOUR job).
 
Yes, this might mean 3 years of programming to save, say, a millisecond
of execution, but it still puts the hand-crafted code ahead of the
higher-level language in RAW efficiency.
 
>> compiler did.
 
> I doubt I could, for example, write an RB-tree system better than the
> one implemented in the C++ STL. It has 30 years of refinement behind it.
 
So you can directly implement that algorithm in C. Yes, it gets
messier, but you CAN do it.
 
And, yes, it also means you need to research (or look at implementations
and 'borrow') the best methods. Lots of programmer effort to get the
greenest code.
 
>> which seems to be what you are talking about.
 
> Indeed, but your argument that C always produces more efficient code than C++
> probably hasn't been true for 20 years.
 
FLAW: Not ALWAYS, but CAN, big difference.
 
Yes, C++ idioms can generate complex code faster than C, and allow
things to be put into libraries in ways that C can't readily match, so
an expert can write the library and 'lend' that expertise to an
ordinary coder who uses it. A custom-coded C version can do just as
well, because you can express the same instruction sequences in C as
were generated from C++; you just need to be more explicit about it.
Amine Moulay Ramdane <aminer68@gmail.com>: Nov 23 06:38PM -0800

Hello,
 
 
More of my philosophy about the nature of our universe and the other universes..
 
I am a white Arab from Morocco, and I think I am smart, since I have
also invented many scalable algorithms and other algorithms.
 
I think I am smart, and I think that the nature of our universe is
that the lowest layer of the subatomic is also made of a kind of
"diversification", as in an ant colony. I think this diversification
comes from a kind of disorder, from the evolutive composition of the
other universes, which are disorder. But there is a special thing to
notice: an individual subatomic constituent of time seems to be made of
disorder, yet the composition of the whole subatomic layer gives an
emergence of order, or intelligence, like the time we know with its
characteristic of being relative.
 
More of my philosophy about the distributed intelligence and more..
 
I think I am smart, so I think the intelligence of an ant colony
emerges from a "distributed" intelligence. An individual ant can be
specialized, as in our kind of civilization, and even though a
specialized ant can be viewed as far less capable of constructing the
colony's intelligence on its own, the distributed intelligence of the
colony makes that intelligence emerge. I think time is the same: the
individual subatomic constituents of time may be more specialized, but
the whole subatomic layer can give emergence to time as we know it and
to the relativity of time.
 
More of my philosophy about what is time as we know it..
 
I think I am smart, and I think you know me much better now, so now I
will explain my point of view on what time as we know it is:
 
I think time, which is relative as Albert Einstein said, is an
"emergence", so it is like an ant colony. If you look at an individual
ant in its colony, you would say it is like disorder that cannot
construct what an ant colony constructs; yet the ant colony is an
"emergence" of intelligence. It is like the stomach of a human, which
is also an emergence of a kind of intelligence that knows how to
process food. So I think time, relative as Einstein said, is also an
"emergence": when you look at time individually, from the subatomic
point of view, you would say it is a disorder that cannot construct the
time we know, by logical analogy with the example of the individual ant
above; but I think the whole subatomic layer has made time as we know
it "emerge", and it is like the emergence of the intelligence of an ant
colony.
 
 
 
Thank you,
Amine Moulay Ramdane.
You received this digest because you're subscribed to updates for this group. You can change your settings on the group membership page.
To unsubscribe from this group and stop receiving emails from it send an email to comp.lang.c+++unsubscribe@googlegroups.com.
