Tuesday, March 29, 2016

Digest for comp.lang.c++@googlegroups.com - 25 updates in 4 topics

Christopher Pisz <nospam@notanaddress.com>: Mar 29 03:27PM -0500

Does there exist a list of operating systems/hardware that I can target
in C++?
 
My office mate is arguing with me that C# is now more portable than C++,
because...mono. He is making me angry.
 
 
--
I have chosen to troll filter/ignore all subthreads containing the
words: "Rick C. Hodgins", "Flibble", and "Islam"
So, I won't be able to see or respond to any such messages
---
Barry Schwarz <schwarzb@dqel.com>: Mar 29 01:56PM -0700

On Tue, 29 Mar 2016 15:27:06 -0500, Christopher Pisz
>in C++?
 
>My office mate is arguing with me that C# is now more portable than C++,
>because...mono. He is making me angry.
 
The language plays no part in how many systems the source code can be
compiled for. It depends solely on whether someone has developed a
compiler/linker/library for those systems.
 
And the number of systems is also mostly irrelevant. For your office
use, it only matters whether the necessary tools exist for a system
you want to target and whether those tools can be used on systems
available in your development suite.
 
--
Remove del for email
Wouter van Ooijen <wouter@voti.nl>: Mar 29 10:59PM +0200

Op 29-Mar-16 om 10:27 PM schreef Christopher Pisz:
> in C++?
 
> My office mate is arguing with me that C# is now more portable than C++,
> because...mono. He is making me angry.
 
Challenge him to mention a platform that supports C#, but not C++.
 
For the reverse you can take any small microcontroller, take a PIC12F200
if he wants a specific example.
 
For your list, start with all GCC targets.
 
Wouter van Ooijen
woodbrian77@gmail.com: Mar 28 06:21PM -0700

On Monday, March 28, 2016 at 1:53:23 PM UTC-5, Öö Tiib wrote:
> > std::list that's sorted.
 
> What? Set is built upon weighted binary tree. There is too large
> difference between sequence and tree.
 
They both make many more memory allocations compared to deque
or vector. They have more overhead related to their allocations
too. The same reasons people warn against using std::list
apply to std::set also.
 
> > I almost never use any of the standard containers other than vector
> > or deque.
 
> May be you have limited use-cases?
 
Sure.
 
> differently and so fits better than others into certain situations.
> People tend to draw various decision charts how to choose
> a container. http://i.stack.imgur.com/G70oT.png
 
There's no mention of anything but standard containers there.
There are other containers that should be considered.
Maybe that chart has been around a long time and never updated.
 
Brian
Ebenezer Enterprises - If you can't join 'em, beat 'em.
http://webEbenezer.net
"Öö Tiib" <ootiib@hot.ee>: Mar 29 12:11AM -0700

> or vector. They have more overhead related to their allocations
> too. The same reasons people warn against using std::list
> apply to std::set also.
 
There can still be situations where that does not matter, even with the
standard allocator. Also, both templates accept a custom allocator. Without
profiling it can be hard to tell whether a sorted 'std::vector' or 'std::set'
is better, but it is quite likely that a sorted 'std::list' loses to
both there.
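 
For reference, the two approaches being compared look roughly like this
(a minimal sketch; int elements are just illustrative):
 
#include <algorithm>
#include <set>
#include <vector>
 
// Keep a std::vector sorted by inserting where lower_bound says the value belongs.
void insert_sorted(std::vector<int>& v, int value)
{
    v.insert(std::lower_bound(v.begin(), v.end(), value), value);
}
 
// The std::set equivalent: one node allocation per insert, ordering maintained for free.
void insert_into_set(std::set<int>& s, int value)
{
    s.insert(value);
}
 
The vector variant moves elements on every insert but allocates rarely and
stays contiguous; the set pays a node allocation and pointer chasing per
element. Which one wins for a given workload is exactly what profiling has
to settle.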
 
 
> There's no mention of anything but standard containers there.
> There are other containers that should be considered.
> Maybe that chart has been around a long time and never updated.
 
I am not claiming that the chart (or others like it) is somehow the
pure and full truth. It just helps to make a first pick from the standard
library. For every single case something else can be better. For example
there are numerous containers and container adaptors in boost. I look at
boost mostly when I notice that a standard container is likely inconvenient
for some reason, or when I later notice that working with the container takes
noteworthy time in the application's profile. That, however, happens with only
about one container in ten. For the other nine out of ten, the simple
suggestion of such a chart is close enough to the best choice.
David Brown <david.brown@hesbynett.no>: Mar 29 09:16AM +0200

> made it clear that std::set is just a std::list that's
> sorted. I almost never use any of the standard
> containers other than vector or deque.
 
I don't remember making such a post - but I don't remember /not/ making
such a post, so it is possible.
 
The different containers have their pros and cons. The ones you mention
here may be the most popular because they have a good balance for
general purpose usage, but that does not mean that the others are not
useful. In general, you pay for the features of the container. If you
are going to use those features, great - if not, it is a waste. So if
you need a simple sequence, your choices are:
 
std::array - very efficient, compile-time checking, but compile-time
fixed size.
 
std::deque - random access, but linear-time insertion and removal in the
middle, and some space costs.
 
std::list - simple and space efficient, but no random access
 
std::vector - fast access, but space inefficient and high worst-case
times for some operations
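 
To put those options side by side in code, a minimal declaration sketch
(the element type and the fixed size are just illustrative):
 
#include <array>
#include <deque>
#include <list>
#include <vector>
 
std::array<double, 1000> a; // size fixed at compile time, no heap allocation
std::deque<double>       d; // random access, cheap growth at both ends
std::list<double>        l; // stable iterators and splice, but no random access
std::vector<double>      v; // contiguous storage, fastest traversal, may over-allocate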
 
 
Then of course there are other container types such as the maps, queue
and stack, that have different features.
 
The C++ library standards group picked the containers as being a good
selection of general purpose, useful container types. They certainly
don't cover all possible types of container - but neither did the
authors think "all people /really/ need are vectors and deques. We'll
just throw in the others to make C++ hard, so that we can sell more books".
Juha Nieminen <nospam@thanks.invalid>: Mar 29 09:16AM

> or vector. They have more overhead related to their allocations
> too. The same reasons people warn against using std::list
> apply to std::set also.
 
std::set (as well as std::map) is actually useful in practice.
std::list not so much. (I'm talking about quite long experience.
I have used the relational data containers quite a lot, but I don't
remember ever using std::list for anything in an actual project.)
The situations where std::list would be useful are so rare that
the standard library wouldn't actually lose much if it didn't have it.
 
Of course now that we have std::unordered_set (and std::unordered_map),
they are often better alternatives (although the ordered versions are
still useful in situations where you benefit from them being ordered,
eg. you need to traverse the elements in order. This can sometimes be
quite useful.)
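 
A small sketch of that last point (the values are illustrative):
 
#include <iostream>
#include <set>
#include <unordered_set>
 
int main()
{
    std::set<int> ordered{5, 1, 4, 2, 3};
    std::unordered_set<int> hashed{5, 1, 4, 2, 3};
 
    for (int x : ordered) std::cout << x << ' ';  // always prints 1 2 3 4 5
    std::cout << '\n';
    for (int x : hashed)  std::cout << x << ' ';  // some unspecified order
    std::cout << '\n';
}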
 
Of course one should be aware of the amount of allocations and other
overhead that they have, and not use them spuriously. Know your tools.
On the other hand, they don't need to be religiously avoided either,
if they are the best tools for the task at hand. Just try not to
add/remove elements in tight inner loops that are run millions of
times, and you should be fine.
 
--- news://freenews.netfront.net/ - complaints: news@netfront.net ---
Juha Nieminen <nospam@thanks.invalid>: Mar 29 09:19AM

> std::vector - fast access, but space inefficient and high worst-case
> times for some operations
 
How exactly is std::vector space inefficient?
 
std::vector may cause some minor memory fragmentation if it needs to
reallocate a lot, but other than that, what exactly is "space
inefficient" about it?
 
--- news://freenews.netfront.net/ - complaints: news@netfront.net ---
David Brown <david.brown@hesbynett.no>: Mar 29 12:13PM +0200

On 29/03/16 11:19, Juha Nieminen wrote:
 
> std::vector may cause some minor memory fragmentation if it needs to
> reallocate a lot, but other than that, what exactly is "space
> inefficient" about it?
 
If you take a vector, and keep adding more to it, then when its current
allocated space is used up, it will re-allocate with a bigger size. Each
time this happens, the extra size increases. How much overhead this is
will depend on how fast the size increases, and the usage pattern for
the vector. But for a simple example, if the extra size was 40% each
allocation (thus doubling the vector size every two expansions), you
would have a 20% average wasted overhead.
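 
That spare capacity is easy to observe directly; a minimal sketch (the
actual growth factor is implementation-defined, commonly 1.5x or 2x):
 
#include <cstddef>
#include <iostream>
#include <vector>
 
int main()
{
    std::vector<int> v;
    std::size_t last_capacity = 0;
    for (int i = 0; i < 1000; ++i) {
        v.push_back(i);
        if (v.capacity() != last_capacity) {  // a reallocation just happened
            last_capacity = v.capacity();
            std::cout << "size " << v.size()
                      << " -> capacity " << v.capacity() << '\n';
        }
    }
    // The gap between size() and capacity() is the spare space being discussed.
}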
Juha Nieminen <nospam@thanks.invalid>: Mar 29 12:10PM

> the vector. But for a simple example, if the extra size was 40% each
> allocation (thus doubling the vector size every two expansions), you
> would have a 20% average wasted overhead.
 
If you have no idea how many elements there may be in the vector, that
may be so. On the other hand, if you know how many elements there will
be, you can either initialize them all at once, or use reserve() to
reserve an exact amount of space for them.
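 
A sketch of the reserve() route, assuming the count is known up front (the
function and its contents are just illustrative):
 
#include <cstddef>
#include <vector>
 
std::vector<double> make_samples(std::size_t n)
{
    std::vector<double> v;
    v.reserve(n);                // one allocation, no spare capacity beyond n
    for (std::size_t i = 0; i < n; ++i)
        v.push_back(0.001 * i);  // never reallocates after the reserve()
    return v;
}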
 
The way std::vector reallocates memory is, of course, done for
efficiency. It would be highly inefficient to reallocate one element
at a time (and insertion at the end of the vector would become a
linear-time operation rather than amortized constant-time.)
 
Note that std::deque may have similarly "wasted" space, because it
allocates memory in blocks. The newest block may have unused space
in it. What percentage that is depends on the size of the container and
the size of those blocks. (In fact, if you also insert at the beginning
of the container, there will also usually be unused space there as well.)
std::deque also requires some extra memory for the pointers to those
blocks, which is some (minor) overhead as well. And std::deque can't
be made to reserve exactly as much memory as needed for a given
amount of elements (unless you happen to hit an exact multiple
of the block size.)
 
--- news://freenews.netfront.net/ - complaints: news@netfront.net ---
David Brown <david.brown@hesbynett.no>: Mar 29 02:33PM +0200

On 29/03/16 14:10, Juha Nieminen wrote:
> may be so. On the other hand, if you know how many elements there will
> be, you can either initialize them all at once, or use reserve() to
> reserve an exact amount of space for them.
 
Absolutely.
 
And if you know at compile time exactly how many elements there will be,
you can use a std::array.
 
I have nothing against vector, nor do I think it is particularly
inefficient. I am just pointing out that there are pros and cons to the
various container types, and it is wrong to say that vector and deque
are all that are ever needed.
 
Mr Flibble <flibble@i42.co.uk>: Mar 29 02:38PM +0100

> or vector. They have more overhead related to their allocations
> too. The same reasons people warn against using std::list
> apply to std::set also.
 
std::deque also makes O(n) memory allocations like std::list and
std::set so you clearly do not know what you are talking about.
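 
For anyone who wants numbers rather than adjectives, a counting-allocator
sketch (C++11; the element count is arbitrary and the exact counts are
implementation-dependent) makes the constants visible:
 
#include <cstddef>
#include <deque>
#include <iostream>
#include <list>
#include <new>
#include <vector>
 
// Counts every call the containers make to allocate(), nothing more.
static std::size_t g_allocations = 0;
 
template <class T>
struct CountingAlloc {
    typedef T value_type;
    CountingAlloc() {}
    template <class U> CountingAlloc(const CountingAlloc<U>&) {}
    T* allocate(std::size_t n) {
        ++g_allocations;
        return static_cast<T*>(::operator new(n * sizeof(T)));
    }
    void deallocate(T* p, std::size_t) { ::operator delete(p); }
};
template <class T, class U>
bool operator==(const CountingAlloc<T>&, const CountingAlloc<U>&) { return true; }
template <class T, class U>
bool operator!=(const CountingAlloc<T>&, const CountingAlloc<U>&) { return false; }
 
int main() {
    const int n = 100000;
 
    g_allocations = 0;
    std::list<int, CountingAlloc<int> > l;
    for (int i = 0; i < n; ++i) l.push_back(i);
    std::cout << "list:   " << g_allocations << " allocations\n"; // roughly one per node
 
    g_allocations = 0;
    std::deque<int, CountingAlloc<int> > d;
    for (int i = 0; i < n; ++i) d.push_back(i);
    std::cout << "deque:  " << g_allocations << " allocations\n"; // roughly one per block
 
    g_allocations = 0;
    std::vector<int, CountingAlloc<int> > v;
    for (int i = 0; i < n; ++i) v.push_back(i);
    std::cout << "vector: " << g_allocations << " allocations\n"; // a logarithmic handful
}
 
Typically that prints one allocation per node for the list, one per
fixed-size block for the deque, and only a handful of geometric
reallocations for the vector.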
 
/Flibble
Mr Flibble <flibble@i42.co.uk>: Mar 29 02:48PM +0100

On 29/03/2016 10:16, Juha Nieminen wrote:
> remember ever using std::list for anything in an actual project.)
> The situations where std::list would be useful are so rare that
> the standard library wouldn't actually lose much if it didn't have it.
 
Absolute bollocks mate; I use std::list slightly more than I use
std::set in my codebase: not all lists need to be ordered.
 
Just because YOU are unfamiliar with the common use-cases for std::list
does not mean that std::list is redundant.
 
/Flibble
"Alf P. Steinbach" <alf.p.steinbach+usenet@gmail.com>: Mar 29 04:17PM +0200

On 29.03.2016 15:48, Mr Flibble wrote:
>> the standard library wouldn't actually lose much if it didn't have it.
 
> Absolute bollocks mate; I use std::list slightly more than I use
> std::set in my codebase: not all lists need to be ordered.
 
I would be interested in some concrete example. There is the splice
functionality, but for the general case it's O(n). Is it that you want
pointers and iterators to remain valid after insertions/deletions?
 
 
> Just because YOU are unfamiliar with the common use-cases for std::list
> does not mean that std::list is redundant.
 
True.
 
However I've been thinking of `std::list` as ~99.99% redundant for a
long time now, and I've not seen anyone arguing the opposite before now.
 
 
Cheers!,
 
- Alf
Geoff <geoff@invalid.invalid>: Mar 29 07:27AM -0700

On Tue, 29 Mar 2016 14:48:02 +0100, Mr Flibble <flibble@i42.co.uk>
wrote:
 
>std::set in my codebase: not all lists need to be ordered.
 
>Just because YOU are unfamiliar with the common use-cases for std::list
>does not mean that std::list is redundant.
 
OK, let's see what the debate group turns up on this:
 
I have a project for analysis of RF intermodulation products.
Currently, the project is fundamentally C but is in the process of
conversion to C++, as if there is any value to that. Be that as it
may, I have global transmitter and receiver arrays defined as:
 
#define MAX_ELEMENTS 1000
 
double Tx[MAX_ELEMENTS + 1];
double Rx[MAX_ELEMENTS + 1];
 
Arrays are indexed in the analysis section in (horrific to BartC)
'for' loops involving up to five index variables. The marker for the
end of the data is 0 for serialization but is not used to control
analysis.
 
The number of active elements in the arrays is adjusted and tracked
as elements are added/deleted by the user. Elements are always
inserted at the end of the arrays and are sorted numerically only when
initiated by the user. Removal shifts the elements downward from the
removal point.
 
Elements are optionally sorted using
std::sort(std::begin(Tx), std::begin(Tx) + Ttotal);
std::sort(std::begin(Rx), std::begin(Rx) + Rtotal);
 
Question, in converting to C++, should the arrays be converted to a
standard container such as a list?
 
What are the advantages/disadvantages?
Mr Flibble <flibble@i42.co.uk>: Mar 29 03:28PM +0100

On 29/03/2016 15:17, Alf P. Steinbach wrote:
 
> I would be interested in some concrete example. There is the splice
> functionality, but for the general case it's O(n). Is it that you want
> pointers and iterators to remain valid after insertions/deletions?
 
Yes the primary reason I use std::list is for stable element references
and iterators: std::list elements have "identity" unlike std::vector
elements for example. Other reasons I use std::list are the occasional
use of the splice functionality and fast insert/erase in the middle
(assuming one already has an iterator to the middle).
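 
A small sketch of both points (the element values are illustrative):
 
#include <cassert>
#include <list>
 
int main()
{
    std::list<int> a = {1, 2, 3};
    std::list<int> b = {10, 20};
 
    std::list<int>::iterator it = a.begin();
    ++it;                 // refers to the element 2
 
    a.push_front(0);      // unlike std::vector, 'it' stays valid
    a.erase(a.begin());   // erasing other elements does not touch it either
    assert(*it == 2);
 
    a.splice(it, b);      // 10 and 20 move in front of 2; no elements are copied
    assert(b.empty());    // a is now 1 10 20 2 3
}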
 
/Flibble
Paavo Helde <myfirstname@osa.pri.ee>: Mar 29 06:36PM +0200

On 29.03.2016 16:27, Geoff wrote:
 
> Question, in converting to C++, should the arrays be converted to a
> standard container such as a list?
 
> What are the advantages/disadvantages?
 
If it works and there is no need to add features or change
functionality, then there is not much reason to convert anything.
 
OTOH, if there is a need to change any related functionality, then it
would probably be a good idea to change the arrays to a standard
container, just in order to simplify the changes and later maintenance.
 
The choice of the container depends on the needs: if performance is
critical, then probably std::vector will do. If performance is not
critical and you want to get rid of the manual sort-trim step, then use
std::set, which keeps the elements in sorted order all the time.
 
With std::vector, sorting and removing would then become:
 
std::sort(Tx.begin(), Tx.end());
Tx.resize(newsize);
 
Benefits:
- no arbitrary limit of 1000 elements
- no fear of buffer overrun
- no need to track the actual size in a separate variable
- elements may move in memory so there would be no temptation to
create long-living pointers to them and complicate the code.
- serialization terminator does not fit here any more and should be
thrown out of the array, making the code clearer.
 
Drawbacks:
- a couple of dynamic allocations in total (might cost some
microseconds).
- elements may move in memory so one cannot create long-living
pointers to them.
- serialization needs to be rewritten a bit to work without the
terminator.
 
std::list is useful only if the elements have identity (i.e. their
address is important) or you need things like splice. In this usage
scenario I do not see any need to use std::list; it would just combine
the worst properties of std::vector and std::set (for this scenario).
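 
Putting that together, a rough sketch of the converted globals (keeping
Geoff's Tx/Rx names; the helper functions are made up for illustration):
 
#include <algorithm>
#include <cstddef>
#include <vector>
 
std::vector<double> Tx;  // Tx.size() replaces Ttotal; no MAX_ELEMENTS limit
std::vector<double> Rx;
 
void add_tx(double f)         { Tx.push_back(f); }                 // insert at the end
void remove_tx(std::size_t i) { Tx.erase(Tx.begin() + static_cast<std::ptrdiff_t>(i)); }
void sort_tx()                { std::sort(Tx.begin(), Tx.end()); } // replaces the manual sort
 
Serialization would then write Tx.size() elements and no terminator.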
 
hth
Paavo
woodbrian77@gmail.com: Mar 29 09:42AM -0700

On Tuesday, March 29, 2016 at 7:33:45 AM UTC-5, David Brown wrote:
> inefficient. I am just pointing out that there are pros and cons to the
> various container types, and it is wrong to say that vector and deque
> are all that are ever needed.
 
After vector and deque, I prefer to use non-standard containers.
One thing that's nice about writing online services is that you have
more flexibility in what you can use compared to some apps.
 
Brian
Ebenezer Enterprises
http://webEbenezer.net
Paavo Helde <myfirstname@osa.pri.ee>: Mar 29 07:02PM +0200

On 29.03.2016 9:16, David Brown wrote:
> std::list - simple and space efficient, but no random access
 
> std::vector - fast access, but space inefficient and high worst-case
> times for some operations
 
std::list allocates each element dynamically and adds forth-and-back
pointers. This would mean e.g. 32 bytes overhead for *each* element in
64-bit builds. I would not call this "space efficient" in any sense.
 
For example, a million-element container of doubles might take 40MB with
std::list and 12 MB with a 50% over-allocated std::vector.
 
(Not to speak about that the processing of compactly placed array in
std::vector would be many times faster than the scattered data in
std::list.)
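 
The arithmetic behind those figures, as a back-of-the-envelope sketch (the
~16 bytes of heap bookkeeping per allocation is an assumption; it varies by
allocator):
 
#include <cstdio>
 
int main()
{
    const unsigned long n = 1000000UL;             // one million doubles
 
    // std::list node: 8 (double) + 16 (two pointers) + ~16 assumed heap bookkeeping
    const unsigned long list_bytes   = n * (8 + 16 + 16);
    // std::vector with 50% spare capacity: 1.5 million contiguous doubles
    const unsigned long vector_bytes = (n + n / 2) * 8;
 
    std::printf("list:   ~%lu MB\n", list_bytes / 1000000);   // ~40 MB
    std::printf("vector: ~%lu MB\n", vector_bytes / 1000000); // ~12 MB
}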
 
Cheers
Paavo
woodbrian77@gmail.com: Mar 29 10:17AM -0700

On Tuesday, March 29, 2016 at 2:16:14 AM UTC-5, David Brown wrote:
 
> I don't remember making such a post - but I don't remember /not/ making
> such a post, so it is possible.
 
Now I'm wondering if it was Chris Vine. Sorry.
 
Brian
scott@slp53.sl.home (Scott Lurndal): Mar 29 05:42PM


>std::list allocates each element dynamically and adds forth-and-back
>pointers. This would mean e.g. 32 bytes overhead for *each* element in
>64-bit builds. I would not call this "space efficient" in any sense.
 
Surely 16 bytes (8 for flink, 8 for blink). If the element stored in the
list is 320 bytes, then it's not a significant overhead (just 5%).
 
 
>For example, a million-element container of doubles might take 40MB with
>std::list and 12 MB with a 50% over-allocated std::vector.
 
Contrived example? Why would you use a std::list for doubles?
Paavo Helde <myfirstname@osa.pri.ee>: Mar 29 08:57PM +0200

On 29.03.2016 19:42, Scott Lurndal wrote:
>> pointers. This would mean e.g. 32 bytes overhead for *each* element in
>> 64-bit builds. I would not call this "space efficient" in any sense.
 
> Surely 16 bytes (8 for flink, 8 for blink).
 
You are forgetting the dynamic allocation service block.
 
 
>> For example, a million-element container of doubles might take 40MB with
>> std::list and 12 MB with a 50% over-allocated std::vector.
 
> Contrived example? Why would you use a std::list for doubles?
 
I wouldn't, and this space inefficiency is one of the reasons why a
std::list of doubles would not make sense (the main reason would be of
course that a million dynamic allocations would be quite slow).
 
I have not yet found a usage case for std::list. If I have identity
objects they are allocated dynamically and accessed via smart pointers,
and smart pointers are best held in a std::vector or std::deque, in my
(definitely limited) experience.
 
Cheers
Paavo
Juha Nieminen <nospam@thanks.invalid>: Mar 29 06:26AM

> So the compiler would need to include special code to check
> the English language preconditions of known functions in the
> special case that the compiler knows the values of the arguments.
 
What did you smoke before making that post?
 
--- news://freenews.netfront.net/ - complaints: news@netfront.net ---
Juha Nieminen <nospam@thanks.invalid>: Mar 29 06:30AM

> On 3/24/2016 4:08 PM, Ian Collins wrote:
 
>> This is one case where fixed-size arrays *are* the best fit!
 
> Perhaps. But, it's not very C++ish.
 
Why not? C++ is intended to provide higher-level abstract object-oriented
programming capabilities without compromising efficiency. Fixed-size
arrays are one of the most efficient data structures (way, way more
efficient than any std::string or std::vector, unless in the former
case your string is short enough to allow for short string optimization,
assuming the library implementation does that, which isn't guaranteed.)
 
Static arrays are an integral part of C++, and personally I prefer
them if they easily solve the problem in hand and can be used safely.
I would never resort to a dynamic data container if a static array
would do just fine.
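 
For example, something along these lines (a made-up kernel, just to show
the shape of such code):
 
#include <cstddef>
 
// A made-up fixed-size filter kernel; no dynamic allocation anywhere.
static const double kernel[4] = { 0.125, 0.375, 0.375, 0.125 };
 
double apply(const double (&samples)[4])
{
    double sum = 0.0;
    for (std::size_t i = 0; i < 4; ++i)
        sum += samples[i] * kernel[i];
    return sum;
}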
 
--- news://freenews.netfront.net/ - complaints: news@netfront.net ---
James Moe <jimoeDESPAM@sohnen-moe.com>: Mar 28 05:52PM -0700

On 03/26/2016 05:46 PM, James Moe wrote:
> gcc v3.3.5
 
> When the exception is thrown "throw error_handler(pmmsend_rc)" the
> program aborts (SIGABRT).
 
After compiling with gcc v4.7.3, the abort stopped occurring.
 
--
James Moe
jmm-list at sohnen-moe dot com
