- "Linus Torvalds Was (Sorta) Wrong About C++" - 22 Updates
- "unnamed", "anonymous" array in C++ - 3 Updates
JiiPee <no@notvalid.com>: Mar 14 03:14PM On 13/03/2015 14:04, Wouter van Ooijen wrote: > someones subset, so to read and maintain code you will need to > understand the whole language. > Wouter oh yes, if other people are involved. But if I code myself, then... |
Richard Damon <Richard@Damon-Family.org>: Mar 13 11:03PM -0400 On 3/12/15 5:05 AM, Johannes Bauer wrote: > seriously? What is going on? In any case, after implementing stubs for > those, it links: 293 kB of binary! That is almost a FIVE-FOLD increase > in code size for doing pretty much nothing. ... > Cheers, > Johannes This sounds like somebody provided a hosted library to the implementation when they should have provided a freestanding library. One issue with C++ is that a full-featured version does add a lot of base complexity to the library to handle the complexity of locales. An implementation is allowed to decide to just support one locale, in which case many things simplify (especially in strings and I/O), and a freestanding version is allowed even more freedom as it doesn't even need to provide the hooks to fake the locale support. Your C environment is likely already using this fact to simplify the library. It really sounds like your C++ library is expecting to be part of a full-featured hosted environment. |
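A quick way to see which kind of implementation a toolchain claims to provide is the standard __STDC_HOSTED__ predefined macro; the following is a minimal sketch, purely illustrative and not taken from the thread:

    // __STDC_HOSTED__ is required by the standard: 1 for a hosted
    // implementation (full library available), 0 for a freestanding one.
    #if __STDC_HOSTED__
    #include <cstdio>
    #endif

    int main() {
    #if __STDC_HOSTED__
        std::puts("hosted: the full standard library is expected");
    #else
        // freestanding: only a small core of headers is guaranteed
    #endif
        return 0;
    }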
BGB <cr88192@hotmail.com>: Mar 14 02:23PM -0500 On 3/12/2015 11:32 AM, Wouter van Ooijen wrote: > A modern jet fighter is (almost) impossible to fly by hand, and very > costly and complex to develop and produce. Does that mean that we'd be > better off fighting our enemies on bikes? this is an issue I have with most current non-C-based languages: too much of the whole "babies' first programming language" thing, and not nearly enough "good for doing actual programming in". even for loosely C based languages, Java went pretty far in that direction, and is in some ways a very poor language to try to "actually write code in" (as in, not just invoke some magic "do it for me" class from the endless expanse of a standard class library). C# is at least a little closer to being sane, but is rendered moot by not really being a usable option for really anything I am doing. so, I ended up with a few options in my case: mostly C, with some C++ and ASM thrown in; a custom scripting language (basically resembles ActionScript3), but is sufficiently heavy-weight to thus far only really be usable on PC style hardware (and in soft-real-time use cases, such as games or similar); a VM for running a C variant. the VM for running a C variant is more-or-less usable in some of my ARM-based projects (on ARM11 class hardware), but has little hope of being a viable option on a typical microcontroller (still too resource heavy, but could maybe be usable on a higher-end Cortex-M or similar). the language is basically C (most normal C code can be ported with little or no modification), but with a few minor tweaks and some aspects (in the parser/compiler/VM machinery) which are actually closer to C# in some areas, and it is not exactly C standards conformant (and drops a few rarely-used / legacy features). a hope is to eventually support a C++ subset with it, but this is not currently a high priority. another possibility (what I had originally imagined it for early on) would be a more performance-oriented version of the scripting language. it uses green-threading and mostly runs in time that would otherwise be spent waiting in the delay loop (however, the code does run a fair bit slower than native). there is some hackery in the interpreter to try to keep maximum running times within about 1us of the target (and it typically runs up to 2us short of the target delay, with any remaining time spent in the delay loop). technically, it is loosely similar to the Dalvik VM (statically typed 3-address bytecode), just more designed to run C. it is also able to (reasonably easily) call into and share data with native code. note that this was used mostly because traditional OS threading wasn't really an option in this case (for example, the Linux scheduler couldn't really get much more accurate than around 1ms or so), so I pretty much needed to run everything single-threaded in a custom scheduler for this stuff. or such... |
Wouter van Ooijen <wouter@voti.nl>: Mar 12 09:41AM +0100 Jorgen Grahn schreef op 12-Mar-15 om 8:22 AM: > choice for more tasks than you'd think. It's not just about resounce > usage: it's about things like static type checking and (for me) more > attention to proper error handling. I agree that Python's duck-typing is a real PITA. But there are benefits: I like indent-equals-structure, and in my experience it is much easier to grab a library and do something simple with it in Python than in C++ (like I recently did: generate my invoices in rtf, or grab the pictures from all pages linked from a certain webpage). And I can distribute the source and the customer can just run it with the appropriate Python implementation on his platform. But I'd never consider writing, let's say, a compiler in Python. Wouter van Ooijen |
Cholo Lennon <chololennon@hotmail.com>: Mar 12 01:25PM -0300 On 03/12/2015 10:31 AM, Stefan Ram wrote: > So the programmers are to blame! > But what if the language is very difficult to be learned? > Then maybe, after all, it /is/ the language that is to be blamed! I'm not saying that you're wrong, indeed maybe you're right, but I've seen professional crappy code in all kinds of languages (Java, VBasic, (Object)Pascal, C/C++, bash, perl, (Visual)Foxpro/DBase, etc, etc)... -- Cholo Lennon Bs.As. ARG |
legalize+jeeves@mail.xmission.com (Richard): Mar 12 06:39PM [Please do not mail me a copy of your followup] Johannes Bauer <dfnsonfsduifb@gmx.de> spake the secret code >All Linus is saying is that he WON'T. Judging from the amount of >extrodinarily shitty C++ code I have seen in my life, I kind of >understand why. The vast overwhelming majority of shitty C++ code I've seen is written by people who keep writing C code and calling it C++. -- "The Direct3D Graphics Pipeline" free book <http://tinyurl.com/d3d-pipeline> The Computer Graphics Museum <http://computergraphicsmuseum.org> The Terminals Wiki <http://terminals.classiccmp.org> Legalize Adulthood! (my blog) <http://legalizeadulthood.wordpress.com> |
Jorgen Grahn <grahn+nntp@snipabacken.se>: Mar 14 01:22PM On Thu, 2015-03-12, Johannes Bauer wrote: > On 11.03.2015 19:37, Wouter van Ooijen wrote: ... >> to its C legacy, I am not gona take such insults from a C lover! > That is definitely false, too. The parts that suck about C++ when people > misuse it are exclusively "new" features. You mean "new" as in "new compared to C 25 years ago". IMO it's the C features, but also the early 1990s fads that suck. People squeezing in virtual inheritance everywhere, creating a unique kind of spaghetti. At the same time, without taking things like value semantics, STL and RAII into account. > Template metaprogramming (oh the horrors), I might agree, if I'd encountered code which used it incorrectly. Used wisely though (by someone smarter than me) it's a big asset. > deeply nested weird inheritance hierarchies Not really new but 20 years old and now fortunately unfashionable; see above. > and use of the > STL which is unsuited for LOTS of tasks (space constrained, like > embedded for example). From where I'm looking, unsuited for surprisingly few tasks. And every day, those tasks are getting fewer. You're really complaining about the container part of the STL I guess, because the algorithms carry no overhead. It would be stupid and pointless to use C qsort() or memcpy() when you have std::sort and std::copy. > those features. As a "C with benefits", not as C++. You won't see any > inheritance at all in my embedded C++ code (you will see interfaces, > though). I don't see a reason for a ban on inheritance -- it's just that I rarely find a good reason to use it. Don't confuse these: - My design doesn't need to use inheritance. - Some people overuse inheritance in ways which make no sense at all. - Inheritance sucks. And, I see no conflict between embedded programming and inheritance. You may create twenty levels deep inheritance hierarchies, and not waste a clock cycle or a word. > My hope is that at least strongly typed enums ("class enums") will go > into C at some point, that would be super cool. I think C11 already as > static assertions, so that's definitely a plus. It's a false hope. What you really want, I think, is to avoid interaction with weird C++ programmers, but then you have to interact with weird C programmers instead ... and they seem to typically refuse to learn anything new, or use anything more recent than ANSI C from a quarter of a century ago. Been there, tried that, failed to convince more than two or three people that languages have improved since they were in diapers in the 1980s. /Jorgen -- // Jorgen Grahn <grahn@ Oo o. . . \X/ snipabacken.se> O o . |
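A small sketch of the std::sort point above, purely illustrative: the C++ algorithm is statically type checked and the comparison can be inlined, while qsort() funnels everything through void* and a function pointer.

    #include <algorithm>
    #include <cstdlib>
    #include <iterator>

    // C-style comparison callback: the casts inside are not checked by the compiler.
    static int cmp_int(const void* a, const void* b) {
        int x = *static_cast<const int*>(a);
        int y = *static_cast<const int*>(b);
        return (x > y) - (x < y);
    }

    int main() {
        int a[] = {3, 1, 2};
        int b[] = {3, 1, 2};
        std::qsort(a, 3, sizeof a[0], cmp_int);   // void*-based, indirect call, no type checking
        std::sort(std::begin(b), std::end(b));    // type checked, comparison can be inlined
        return 0;
    }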
Johannes Bauer <dfnsonfsduifb@gmx.de>: Mar 13 11:04AM +0100 On 12.03.2015 23:43, Christopher Pisz wrote: >> Or any STL. Or any more advanced C++ features. Yup, that is what I'm >> doing. > Common C programmer mentality. Always a fail. I gave an example where including a *single* reference to std::string made the program 4 times (!) the size of the controller ROM I'm using. It just does not fit in the chip. Don't know why this is so hard for your "mentality" to understand, but it's not a "fail" on my part, but a hard, physical constraint. > Who are these deluded people that think they can do better than all the > intelligent minds combined that went into coming up with the STL, not to > mention the millions using it? Why do you think I claim I can come up with more "intelligent" code? I never claimed that, and that you seem to suggest I did means the deluded one here is you. I don't claim to be able to write better code than the STL. I claim to be able to write code that ACTUALLY WORKS in my scenario, where any single reference to STL functionality blows the code so out of proportion that it becomes LITERALLY unusable on constrained hardware. > So, if I load up your projects, do I get to experience the joys of Don't worry, you never will "load up my projects". Because nobody would hire someone with such a deluded mindset to do an embedded programmer's job. And because if they did they would fire you on the first day when you come back from a task and say "it can't be done, we need different hardware". > "faster, better" strings that aren't faster or better at all, 6000 line > header files full of typedef-ed types, and a plethora of other > constructs, because you are better than everyone else? So much anger and so much ignorance in your words. What a pity. > Been there, done that on several projects that in the end went belly-up, > all due to some legacy C programmer, that refused to modernize, that > they wouldn't get rid of. Life must be hard for you. I truly hope you find someone you can cry your sorrows to. Cheers, Johannes -- >> Wo hattest Du das Beben nochmal GENAU vorhergesagt? > Zumindest nicht öffentlich! Ah, der neueste und bis heute genialste Streich unsere großen Kosmologen: Die Geheim-Vorhersage. - Karl Kaos über Rüdiger Thomas in dsa <hidbv3$om2$1@speranza.aioe.org> |
JiiPee <no@notvalid.com>: Mar 14 03:19PM On 13/03/2015 15:19, Johannes Bauer wrote: > programmers, and were ACTUALLY just coding C. Because "real" C++ > programmers do everything right, of course. > The problem is not that C++ is a bad language, it's a fantastic one. yes > it's much harder (but sure enough still happens on a daily basis). > That said, both languages are nothing for beginners. I can only shake my > head when people suggest C++ as a first language. kind of agree.... but, I think it depends on how quickly and well the student is able to learn. For example I was a quick learner when I started C/C++. In a couple of months I was doing pretty complex programs. So I do not think that at that moment I would have had any problems with somebody teaching me C++, even new things. It depends on how motivated and skillful one is. But for a class of students, yes, I agree C++ would maybe not be the best to start with. |
Wouter van Ooijen <wouter@voti.nl>: Mar 12 02:47PM +0100 Stefan Ram schreef op 12-Mar-15 om 2:31 PM: > education managers who choose C++ for pupils and students as > a first language to gain some programming experience when > they only have little time for that class.) Now that's one I can agree on. If you want someone to have a quick programming experience, throw out all the Pascal and C derivatives, and go with Python. Wouter |
Johannes Bauer <dfnsonfsduifb@gmx.de>: Mar 12 08:12PM +0100 On 12.03.2015 15:17, Luca Risolia wrote: > Nonsense. Try to implement all the functionalities of a std::string in C > and then you will see what you are talking about. To be honest I don't give a shit fuck what std::string COULD do. For all I know it provides a freaking list-interpreter on top of it all. The point is: I don't want to pay for what I don't use. I only want code for the stuff I need, not for what I MIGHT need. Sure, if I want to fly to the moon I might need very large booster rockets and a spaceship. But if I only want to go shopping my car will do just fine. Cheers, Johannes -- >> Wo hattest Du das Beben nochmal GENAU vorhergesagt? > Zumindest nicht öffentlich! Ah, der neueste und bis heute genialste Streich unsere großen Kosmologen: Die Geheim-Vorhersage. - Karl Kaos über Rüdiger Thomas in dsa <hidbv3$om2$1@speranza.aioe.org> |
Johannes Bauer <dfnsonfsduifb@gmx.de>: Mar 12 09:10PM +0100 On 12.03.2015 20:36, Luca Risolia wrote: >> The point is: I don't want to pay for what I don't use. I only want code >> for the stuff I need, not for what I MIGHT need. > This does make C a better language, as you seemed to imply. I most certainly did not imply that C is a better language. Both languages are merely tools for different jobs. Not every tool suits every job. > Just don't > use std::string or streams in that case. Or any STL. Or any more advanced C++ features. Yup, that is what I'm doing. Cheers, Johannes -- >> Wo hattest Du das Beben nochmal GENAU vorhergesagt? > Zumindest nicht öffentlich! Ah, der neueste und bis heute genialste Streich unsere großen Kosmologen: Die Geheim-Vorhersage. - Karl Kaos über Rüdiger Thomas in dsa <hidbv3$om2$1@speranza.aioe.org> |
Wouter van Ooijen <wouter@voti.nl>: Mar 11 09:10PM +0100 JiiPee schreef op 11-Mar-15 om 8:32 PM: > change their C-code to C++. Its a music hardware which has only like 64K > memory. He wanted to save every byte. I first said "well, I would use > std::vector.... lets install std stuff". Why use a std::vector, which is a flexible array, if you want a fixed array? You can't blame the language (or rather, the library) for giving you options that are wrong for your situation. > Although now as am thinking, he created many other classes and objects > willingly, surely they also take extra memory... so its not so obvious > which is best. A class does not need to take more memory than the data it stores and the code that implements its methods. Wouter van Ooijen |
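For the fixed-array case described above, a sketch of the usual alternative (std::array, C++11), which keeps a container-style interface but stores its elements in place, with no heap allocation; purely illustrative:

    #include <array>
    #include <cstdint>

    // 64 bytes, zero-initialized, living in static storage; no malloc/new involved.
    std::array<std::uint8_t, 64> buffer{};

    int main() {
        buffer[0] = 0x42;
        return buffer.size() == 64 ? 0 : 1;   // size() is known at compile time
    }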
Wouter van Ooijen <wouter@voti.nl>: Mar 12 11:14AM +0100 Stefan Ram schreef op 12-Mar-15 om 10:40 AM: > programming in C++. He should have come across texts such as (quote): > »Most of the standard and third-party C++ libraries don't > adhere to these limitatins, so they are not useable.« Hey, that's my text, with misspellings and all :) > (German-language) C++ tutorial has a small lesson on > embedded programming containing this advice: > · avoid exceptions (code size) IMO exceptions are not that bad (and often better than the alternatives), if the common implementations would avoid dragging in dynamic memory. More at http://www.voti.nl/blog/?p=40 > · possibly avoid dynamic memory IMO this is the root problem. Dynamic memory is a defense-in-depth, a flexible response, a way to use an unknown amount of a resource (memory), at the cost of some unpredictability (both in timing and in 'will the memory be available', and what to do when it is not). For a (very) small system those costs often outweigh the advantages. > · possibly avoid virtual classes and templates I used to agree on virtual classes, but the upcoming devirtualization might change that completely. I like to use the mechanism, but I hate the cost (it is an optimization and dead-code-removal barrier). When I can have it without the costs: jummy! For me templates are essential. > · possibly avoid iostreams Avoid the current implementations. At a high level I like the operator<< syntax. > · possibly avoid RTTI, dynamic_cast and the container library IMO RTTI and dynamic_cast should be avoided anyway. > · possibly avoid temporary objects I would not know why. Wouter van Ooijen |
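The "keep the operator<< syntax, avoid the current iostream implementations" idea above might look roughly like the following sketch; uart_putc() is a hypothetical board-specific function and nothing here is taken from an actual library:

    extern void uart_putc(char c);   // assumed to exist elsewhere in the project

    struct tiny_ostream {
        tiny_ostream& operator<<(char c)        { uart_putc(c); return *this; }
        tiny_ostream& operator<<(const char* s) { while (*s) uart_putc(*s++); return *this; }
        tiny_ostream& operator<<(unsigned v) {   // unsigned decimal output
            char buf[10];
            int n = 0;
            do { buf[n++] = char('0' + v % 10); v /= 10; } while (v);
            while (n) uart_putc(buf[--n]);
            return *this;
        }
    };

    tiny_ostream console;

    // usage: console << "adc = " << 1023u << '\n';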
Luca Risolia <luca.risolia@linux-projects.org>: Mar 12 03:17PM +0100 On 12/03/2015 10:23, Johannes Bauer wrote: > resolved at compile-time) and what will introduce tons of overhead. By > contrast in a language as simple as C it is simply not possible to > intruduce such overhead by seemingly subtle changes in code. Nonsense. Try to implement all the functionalities of a std::string in C and then you will see what you are talking about. |
Johannes Bauer <dfnsonfsduifb@gmx.de>: Mar 12 10:23AM +0100 On 12.03.2015 10:18, Stefan Ram wrote: >> And for resource constrained systems, templates are the absolute worst > Templates fit small-memory systems well when the > programmer is knowing what he is doing. But this is exactly the criticism about C++ in the original article: You absolutely *have* to know the implications of what you're doing in detail, or you can introduce a vast array of problems by a single line or reference (like std::string). It's not transparent by any means what is efficient (and can maybe even be resolved at compile-time) and what will introduce tons of overhead. By contrast, in a language as simple as C it is simply not possible to introduce such overhead by seemingly subtle changes in code. Cheers, Johannes -- >> Wo hattest Du das Beben nochmal GENAU vorhergesagt? > Zumindest nicht öffentlich! Ah, der neueste und bis heute genialste Streich unsere großen Kosmologen: Die Geheim-Vorhersage. - Karl Kaos über Rüdiger Thomas in dsa <hidbv3$om2$1@speranza.aioe.org> |
Jorgen Grahn <grahn+nntp@snipabacken.se>: Mar 12 07:32PM On Thu, 2015-03-12, Wouter van Ooijen wrote: > I agree that Pythons duck-typing is a real PITA. But there are benefits: > I like indent-equals-structure, and in my experinece it is much esasier > to grab a library and do something simple with it in Python than in C++ Yes. That could and should be improved in C++. Sometimes I get the feeling C++ libraries (like Boost) are written for particle physicists and people doing their own nuclear test simulations. Flawless but complex. A Python library is often not designed to last a century, have maximum performance or cover exactly all uses. So the Python libraries are more practical and easier to use. Perhaps the C compatibility of C++ has a drawback here. You can catch both the C and C++ audience by making your library have a C interface. So I guess I'm arguing it's more a question of culture than of language support. /Jorgen -- // Jorgen Grahn <grahn@ Oo o. . . \X/ snipabacken.se> O o . |
Wouter van Ooijen <wouter@voti.nl>: Mar 12 11:00AM +0100 Johannes Bauer schreef op 12-Mar-15 om 10:05 AM: > results, the results are really amazing. > But because of the debugging and readability issues. Oh and the horrors > of compiler errors with templates. It's dreadful. As with all features, it is important how they are used. When I write a class template, the very first lines check the class arguments for suitability. The error messages are still not as clear as they are for a straight error in your code, but you only need to look at the first three error lines (the assert failure, dismiss the location of that assert, and the next line is the template invocation). > And for resource constrained systems, templates are the absolute worst > because they introduce a TON of code duplication. Again, not when used correctly. I mostly use templates for very thin adapters, where inlining saves more code than the duplication costs. But in other situations I derive (or implement) the template class from/on-top-of a non-template base class. I guess this pattern has a name, but I don't know it. > I'm not sure what > systems you're referring to, but I'm working on Cortex-M > microcontrollers. So let's say a Cortex-M0 with 64k flash. That's a *large* chip in my book. I mostly use LPC1114 level chips. > I dare you to > use one, only a single one of those templated STL library functions and > your ROM will be full already. STL offers run-time flexibility in the size of what you are storing/handling/computing. If you need that flexibility, OK, but most small systems don't: they have a fixed-sized task that must be done, no matter what. Doing more is not a plus. So the STL approach as it is *implemented* in the current STL is totally unsuited for (very) small systems. But some of the basic principles could be very useful, if implemented appropriately (fixed-maximum-size containers, allocated without heap). > seriously? What is going on? In any case, after implementing stubs for > those, it links: 293 kB of binary! That is almost a FIVE-FOLD increase > in code size for doing pretty much nothing. I agree that the implementation of std::ostream *sucks* for use on small systems. But I do like the operator<< approach, so I have been struggling to find alternatives. So far no real solution, but someday... > useless code that is pulled in actually did shock me. And I double > checked to see if my measurement was correct, and it was. The code is > full of monstrosities like When you switch to the desktop mindset (memory is cheap, but only cache is fast, average performance is what counts, and better be safe (catch exceptions) than sorry) the design choices that produce this result are not that weird. It is just that small embedded systems require a very different mindset. > Sounds very interesting. Do you have any pointers, I wasn't aware of > that feature. It sounds like something like that could be very useful > for very small systems indeed. No, I have not found much beyond the very technical descriptions of how it is implemented. I use the latest ARM GCC from https://launchpad.net/gcc-arm-embedded , with -fdevirtualize. I compile 'all-at-once', which might or might not be needed. I did a talk at Meeting C++ about my previous approach, based purely on templates, called "objects? no thanks": http://meetingcpp.com/index.php/tv14/items/14.html IMO C++ is a wonderful language for developing code (especially re-useable code) for very small systems, but it is difficult to find the correct subset of the language, libraries and programming paradigms (yuck, I hate that word) because 1. the vast majority of C++ users work in different fields and hence have very different priorities 2. many very-small-systems developers don't appreciate the advantages of C++ and are scared away by the horror stories of linking in large libraries. Wouter van Ooijen |
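The "fixed-maximum-size containers, allocated without heap" remark above could look something like this sketch; it is purely illustrative (a real implementation would also have to deal with non-trivial element types):

    #include <cstddef>

    template<typename T, std::size_t MaxSize>
    class static_vector {
        static_assert(MaxSize > 0, "static_vector needs a non-zero capacity");
        T data_[MaxSize];              // storage lives inside the object, no allocator
        std::size_t size_ = 0;
    public:
        bool push_back(const T& v) {   // reports failure instead of growing
            if (size_ == MaxSize) return false;
            data_[size_++] = v;
            return true;
        }
        std::size_t size() const { return size_; }
        T&       operator[](std::size_t i)       { return data_[i]; }
        const T& operator[](std::size_t i) const { return data_[i]; }
        T*       begin() { return data_; }
        T*       end()   { return data_ + size_; }
    };

    static_vector<int, 16> samples;    // room for 16 ints, placed in static storage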
David Brown <david.brown@hesbynett.no>: Mar 13 10:03AM +0100 On 12/03/15 17:33, JiiPee wrote: >> go with Python. >> Wouter > not Visual Basic?? If by "quick programming experience" you mean "try being a professional programmer for a few weeks, then switch career paths to being a crash test dummy because it's less painful", then Visual Basic is excellent at finishing your programming experience quickly :-) (I learned to program in Basic, on a wide variety of home computers. VB is not actually that bad, as long as you are only doing very small and simple programs, and don't presume that you are learning more than baby steps of programming. But while I'd not recommend anyone to bother learning VB, since there are better alternatives, if the choice was only C++ or VB for first language, then VB is a better choice.) |
Johannes Bauer <dfnsonfsduifb@gmx.de>: Mar 12 11:02PM +0100 On 12.03.2015 22:24, Paavo Helde wrote: >> Which errors do you have in mind that C doesn't catch at compile time >> but C++ does? > Are you kidding? LOL, no but with your examples I'm sure YOU are kidding: > int n = 100; > long* p = malloc(n); > p[n-1] = 42;

std::vector<int> x;
x.push_back(123);
std::cerr << x[-1] << std::endl;

Totally caught at compile time *cough* NOT > enum b { b1, b2 }; > void g(enum b arg) {} > g(a1); Yup, that one is legit. It was also the example I explicitly gave. > foo(42); > ok, the latter is a linker error in C++, not compile time, but still much > better than a random error on random architecture Oh, that is indeed neat! Nice. > The list goes on, but the main point is that in normal C++ one does not > use known error-prone constructs like C-style casts, Haha, sure, it does the super easy thing of having a HANDFUL of different cast operators which makes it very easy and intuitive to use. > so more errors are > caught in compile-time even if the corresponding C-style code would > compile without errors in C++. Well, to be honest, you only came up with one example that was news to me. The vector thing isn't caught at compile time AT ALL, the enum example was the example I gave in the first place, and the last example is indeed neat. My C compiler could and should do that, too. I wish it did. Cheers, Johannes -- >> Wo hattest Du das Beben nochmal GENAU vorhergesagt? > Zumindest nicht öffentlich! Ah, der neueste und bis heute genialste Streich unsere großen Kosmologen: Die Geheim-Vorhersage. - Karl Kaos über Rüdiger Thomas in dsa <hidbv3$om2$1@speranza.aioe.org> |
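A compact sketch of the checks being discussed, purely illustrative: the lines marked "rejected in C++" are accepted (at most with a warning) by a C compiler, while the operator[] line compiles in both cases and is only a runtime problem.

    #include <cstdlib>
    #include <vector>

    enum a { a1, a2 };
    enum b { b1, b2 };
    void g(enum b) {}

    int main() {
        int n = 100;
        // long* p = malloc(n);                        // rejected in C++: no implicit void* -> long* conversion
        long* p = static_cast<long*>(std::malloc(n));  // the required cast makes the size bug more visible
        p[n - 1] = 42;                                 // still writes past the 100 bytes; neither language catches it

        // g(a1);                                      // rejected in C++: no implicit enum a -> enum b conversion
        g(b1);                                         // (a C++11 enum class would be stricter still)

        std::vector<int> x;
        x.push_back(123);
        // x[-1];                                      // compiles fine; bounds are purely a runtime issue

        std::free(p);
        return 0;
    }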
Johannes Bauer <dfnsonfsduifb@gmx.de>: Mar 12 10:05AM +0100 On 12.03.2015 09:37, Wouter van Ooijen wrote: > Template programming is one of the things in C++ I like most, and IMO it > is one of the most powerfull tools in writing re-useable code for very > resource constrained systems. Ooooh I think template programming is dreadful. Not because of the results, the results are really amazing. But because of the debugging and readability issues. Oh and the horrors of compiler errors with templates. It's dreadful. And, since you said you think duck-typing is annoying in Python, you get similar results in C++ templating code: You can introduce type errors that aren't ever reported until someone changes something about the code that uses templating, when all of a sudden everything just blows up. It is annoying. And for resource constrained systems, templates are the absolute worst because they introduce a TON of code duplication. I'm not sure what systems you're referring to, but I'm working on Cortex-M microcontrollers. So let's say a Cortex-M0 with 64k flash. I dare you to use one, only a single one of those templated STL library functions, and your ROM will be full already. > the language) are unsuited to programming very small systems. But even > with those features discarded I still like C++ better for such work than > anything else. I just tested it, have a project here that does a ton of work on a Cortex-M3, compiled in C. 62k of code, with many parts of the C standard library in there (printf and friends). Now I added one single file:

#include <string>
#include <iostream>

void foo(const std::string &moo) {
    std::cout << moo;
}

And linked the whole thing again with g++. First observation: New syscalls are drawn in. kill, getpid and open. What? kill and getpid, seriously? What is going on? In any case, after implementing stubs for those, it links: 293 kB of binary! That is almost a FIVE-FOLD increase in code size for doing pretty much nothing. I must admit that I expected horrible results, but the sheer amount of useless code that is pulled in actually did shock me. And I double checked to see if my measurement was correct, and it was. The code is full of monstrosities like

800f324: d809 bhi.n 800f33a <std::istreambuf_iterator<wchar_t, std::char_traits<wchar_t> > std::num_get<wchar_t, std::istreambuf_iterator<wchar_t, std::char_traits<wchar_t> > >::_M_extract_int<unsigned int>(std::istreambuf_iterator<wchar_t, std::char_traits<wchar_t> >, std::istreambuf_iterator<wchar_t, std::char_traits<wchar_t> >, std::ios_base&, std::_Ios_Iostate&, unsigned int&) const+0x1be>

And I guess if for every variation of every parameter those compiled template binaries are duplicated, then, yes, you get a fivefold increase in code size. > the results. In the appropriate situations (where direct calls would be > using in C) the compiler eliminates all VFT's and can even be persuated > to inline the calls. Sounds very interesting. Do you have any pointers, I wasn't aware of that feature. It sounds like something like that could be very useful for very small systems indeed. Cheers, Johannes -- >> Wo hattest Du das Beben nochmal GENAU vorhergesagt? > Zumindest nicht öffentlich! Ah, der neueste und bis heute genialste Streich unsere großen Kosmologen: Die Geheim-Vorhersage. - Karl Kaos über Rüdiger Thomas in dsa <hidbv3$om2$1@speranza.aioe.org> |
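For contrast only, and not suggested as a general std::string replacement: the same one-line function written against the C library that the 62 kB baseline already links, which is roughly what "paying only for what you use" looks like in this situation.

    #include <cstdio>

    // Takes a plain C string; nothing beyond the already-present C library is pulled in.
    void foo(const char* moo) {
        std::fputs(moo, stdout);
    }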
Johannes Bauer <dfnsonfsduifb@gmx.de>: Mar 12 09:12PM +0100 On 12.03.2015 20:38, Christopher Pisz wrote: > Is catching errors at compile time rather than runtime "overhead" too? > I bet maintenance cost > overhead. Which errors do you have in mind that C doesn't catch at compile time but C++ does? Off-hand I can only think of strongly typed enums which C++ has and C doesn't. Cheers, Johannes -- >> Wo hattest Du das Beben nochmal GENAU vorhergesagt? > Zumindest nicht öffentlich! Ah, der neueste und bis heute genialste Streich unsere großen Kosmologen: Die Geheim-Vorhersage. - Karl Kaos über Rüdiger Thomas in dsa <hidbv3$om2$1@speranza.aioe.org> |
ram@zedat.fu-berlin.de (Stefan Ram): Mar 19 11:53PM >print((const char* const []){word1, word2, word3}); Compound literals are not part of C++ IIRC. >Could it be written in a more idiomatic way, or is it already ok? A program with the behavior that possibly was intended is:

#include <iostream>

int main(){ ::std::cout << "C++\nis\ngreat\n"; }

. |
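One idiomatic C++ way to pass an unnamed sequence of strings is an initializer list; this sketch assumes print() can be declared to take one (the original print() signature is not shown in the digest):

    #include <initializer_list>
    #include <iostream>

    void print(std::initializer_list<const char*> words) {
        for (const char* w : words) ::std::cout << w << '\n';
    }

    int main() {
        const char* word1 = "C++";
        const char* word2 = "is";
        const char* word3 = "great";
        print({word1, word2, word3});   // the braced list plays the role of the unnamed array
    }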
ram@zedat.fu-berlin.de (Stefan Ram): Mar 13 03:00PM >Do note that VB.net is a half decent language, but VB up till version 6 >or most forms of VBA are so buggy and badly documented, I would advice to >learn C++ instead. I give courses in both VBA and C++ (both during the recent months). I teach adults without any required previous knowledge. Unfortunately, for many of them, learning programming is very hard. I can observe that, with VBA, they are better motivated (because they already use Office and know what they want to do with VBA) and they get more sense of achievement earlier, because at the end of a short beginner's class, they already can write GUI software in VBA. In C++, I do not teach GUI programming because I only teach the language and its standard library in my courses, and, moreover, GUI programming would still be too difficult for the beginner's course in C++. So for a beginner with average previous knowledge, I do recommend VBA.

    VBA  C++
    Yes  No   Retrieve Web pages with the standard library (WTSL)
    Yes  No   Read directories of the file system WTSL
    Yes  No   Build GUIs WTSL
    Yes  No   Easily work with text documents WTSL
    Yes  No   Easily work with spreadsheets WTSL
    Yes  No   A standard IDE with a debugger
    Yes  No   A simple REPL (immediate windows)

"Hello, World" in VBA (immediate window):

    ? "Hello, World"

"Hello, World" in C++:

    #include <iostream>

    int main() { ::std::cout << "Hello, World" << "\n"; }

I have the impression that the majority of the students in my VBA course at the end of the beginner's course already have reached the point where they can use the language productively in their lives, while some students in my C++ courses might never reach that point. |
ram@zedat.fu-berlin.de (Stefan Ram): Mar 14 06:03PM Supersedes: <inheritance-20150314185313@ram.dialup.fu-berlin.de> [new quotation from Jorgen][modified diagram] >I don't see a reason for a ban on inheritance -- it's just that I >rarely find a good reason to use it. Is anything wrong with:

                     ostream
                        ^
                        |
         .--------------'--------------.
         |                             |
   ostringstream                    ofstream

? |
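A short sketch of what that hierarchy buys in practice: code written against ostream& works unchanged with an ofstream, an ostringstream, or cout (the file name below is made up):

    #include <fstream>
    #include <iostream>
    #include <sstream>

    void report(std::ostream& out) {
        out << "status: ok\n";
    }

    int main() {
        std::ostringstream oss;
        std::ofstream file("report.txt");
        report(std::cout);   // console
        report(oss);         // in-memory string
        report(file);        // file on disk
    }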
You received this digest because you're subscribed to updates for this group. You can change your settings on the group membership page. To unsubscribe from this group and stop receiving emails from it send an email to comp.lang.c+++unsubscribe@googlegroups.com. |