Wednesday, September 21, 2022

Digest for comp.programming.threads@googlegroups.com - 5 updates in 5 topics

Amine Moulay Ramdane <aminer68@gmail.com>: Sep 20 07:46PM -0700

Hello,
 
 
 
More of my philosophy about accuracy and about egoism and more of my thoughts..
 
I am a white arab, and i think i am smart since i have also
invented many scalable algorithms and algorithms..
 
 
I think i am highly smart since I have passed two certified IQ tests
and i have scored "above" 115 IQ, so as you are noticing, i am
making you understand that Adam Smith has made a mistake,
since the egoism of a society or a zone like the European Union makes you
become convinced that there has to be collaboration and solidarity
between the people of that society or zone, and from this conviction
you become sincere in that collaboration and solidarity, so it becomes
an engine too, as i am explaining below. But notice with me that to be
efficient you have to be convinced, so you have to educate people in an
efficient way so that they become convinced of this collaboration and solidarity.
And i also say that the very important thing is not just to score, but to score "precisely" and "efficiently", so education has to be "precise" and "efficient". So it is why i say in french: "Je trouve que la précision est si importante, alors apprenons à être précis", and it means in english: "I find accuracy so important, so let's learn to be accurate", so i invite you to read my previous thoughts so that you understand my views:
 
More of my philosophy about the philosopher and economist Adam Smith and more of my thoughts..
 
 
I think i am highly smart since I have passed two certified IQ tests and i have scored "above" 115 IQ, and i think that the philosopher and economist Adam Smith, the father of economic Liberalism, was pessimistic, by logical analogy with the pessimistic philosopher Arthur Schopenhauer, since he said that humans are
individually selfish and egoistic, and he said that: "Human egoism is the engine of the prosperity and happiness of nations". I understand that egoism is an engine and how it brings value, but i say that egoism is not the only engine, since for example the egoism of a society or a zone like the European Union can rapidly bring collaboration and solidarity between its people, and this collaboration and solidarity become an engine too. Egoism can not be the only engine, since the goal is not the only value; the value is also the "path" to the goal. So i say that this collaboration and solidarity, brought rapidly by the egoism of a society or a zone like the European Union, gives value to the path to the goal: it also stabilizes the system, brings order to it, and makes it much less violent. And look also at the following article that talks about a study that says that humans are naturally and individually selfish and egoistic:
 
https://thebaynet.com/humans-are-naturally-selfish-study-finds-html/
 
 
So i think that the above article and study are also making the same
mistake as the philosopher and economist Adam Smith.
 
 
More of my philosophy about specialization in Economic integration and about productivity and quality and more of my thoughts..
 
I think i am highly smart since I have passed two certified IQ tests and i have scored "above" 115 IQ, and here is what i have just said about
Economic integration and research & development (R&D) investment:
 
"Economic integration, or regional integration, is an agreement among nations to reduce or eliminate trade barriers and agree on fiscal policies. The European Union, for example, represents a complete economic integration. Strict nationalists may oppose economic integration due to concerns over a loss of sovereignty, so i say that we must not just look at economic integration, but also at how much is spent on research & development (R&D). I think that African leaders committed to investing 1% of GDP in research and development. With the African continent's estimated GDP currently at US$2.7 trillion, investment in science and technology is meant to be at US$27 billion, but i think that Africa is very much behind even China in research & development (R&D) investment, since China spends US$514.798 billion on research & development (R&D), the USA spends US$612.714 billion, the European Union spends around US$413 billion, and India spends US$158.691 billion. But i think that the problem of Africa is that it is "much" less economically integrated than the European Union, so we also have to look at the economic integration of Arab countries and South American countries and such, since economic integration is so important."
 
 
And so as to be more precise, i say that Economic integration also benefits from the specialization that the philosopher and economist Adam Smith, the father of economic Liberalism, talked about, since when in an economy or in a job you specialize in what you do better, it tends to raise both productivity and quality, and of course when you raise productivity and quality it also tends to create more jobs. It is the same for Artificial intelligence and automation: they also raise productivity and quality, so they create more jobs. And of course you have to think of capitalism as not a zero-sum game, and capitalism is not a zero-sum game since for example more wealth is always being produced, since we take natural resources and transform them into the goods that we want. We grow more food, weave cloth, build homes. We produce a larger supply of goods, and new goods which we never had before. Production increases the quantity of real, physical wealth. So then i can say that Economic integration is beneficial in many ways, as it allows countries to specialize and trade without government interference, which can benefit all economies. It results in a reduction of costs and ultimately an increase in overall wealth, and economic integration can facilitate access to a larger consumer base, a greater pool of qualified workers, additional sources of financing, and new technologies.
 
And i have also just talked more about the very important thing that we call specialization, and here is what i have just said about it:
 
More of my philosophy about education and specialization and more of my thoughts..
 
I think i am highly smart since I have passed two certified IQ tests and i have scored "above" 115 IQ, so notice that i have not scored
115 IQ, but "above" 115 IQ, and now i will give you an IQ test by
inviting you to look at the following video, since i have just looked
at it:
 
 
Why FREE college HURTS the poor - VisualEconomik EN
 
https://www.youtube.com/watch?v=vZvxnk1boAE
 
 
So i am discovering smart patterns with my fluid intelligence in
the above video: the person talking in the above video is not seeing with his fluid intelligence, as i am seeing it with my fluid intelligence, that school is not specialized for the low IQs that are around 25% of people. I mean that schools and their courses are tuned for the average IQ and, to a certain level, for the high IQ, and not for the low IQs that are around 25%, so to solve the problem we have to also "specialize" school for the low IQs, and it means having schools for low IQs that teach the low IQs with ease, and it is also valid for the free schools, and it is the way i have just spoken about IQ and specialization by saying the following:
 
 
More of my philosophy about IQ tests and more of my thoughts..
 
 
I think i am highly smart, and I have passed two certified IQ tests and i have scored above 115 IQ, but i have just passed more and more IQ tests, and i have noticed that the manner in which we test with IQ tests is not correct, since in an IQ test you can be more specialized and above average in one subtest of intelligence than in another subtest inside the same IQ test, since IQ tests test for many kinds of intelligence such as spatial, speed, calculation and logical intelligence etc., so i think that you can be really above average in logical intelligence, as i am really above average in logical intelligence, but at the same time you can be below average in calculation and/or spatial ability. Since an IQ test doesn't report this kind of specialization of intelligence, i think it is not good, since testing for this kind of specialization in intelligence is really important so as to be efficient by knowing the strong advantages of this or that person in every type of intelligence. And about the importance of specialization, read carefully my following thought about it:
 
 
More of my philosophy about specialization and about efficiency and productivity..
 
The previous CEO of General Electric, Larry Culp, was the architect of a strategy that represented a new turning point in world corporate strategies: Larry Culp's strategy was to divide the company according to its activities. Something like: we are better off alone, separately and
focused on each one's own activity, than together in a large
conglomerate. And it is a move from integration to specialization.
You see, it is thought that a company always gains economies of scale
as it grows, but this is not necessarily the case, since as the company
gains in size - especially if it engages in many activities - it
also generates its own bureaucracy, with all that entails in terms
of cost and efficiency. And not only that, it is also often the case
that by bringing together very different activities, strategic focus is lost and decision-making is diluted, so that in the end no one ends up
taking responsibility. It doesn't always happen, but these reasons are
basically what is driving this increasing specialization. So i invite you to look at the following video to understand more about it:
 
The decline of industrial icon of the US - VisualPolitik EN
 
https://www.youtube.com/watch?v=-hqwYxFCY-k
 
 
And here are my previous thoughts about specialization and productivity so that you understand much more:
 
More about the Japanese Ikigai and about productivity and more of my thoughts..
 
Read the following interesting article about Japanese Ikigai:
 
The More People With Purpose, the Better the World Will Be
 
https://singularityhub.com/2018/06/15/the-more-people-with-purpose-the-better-the-world-will-be/
 
I think i am highly smart, so i say that the Japanese Ikigai is like a Japanese philosophy that is the right combination or "balance" of passion, vocation, and mission, and Ikigai and MTP, as concepts, urge us to align our passions with a mission to better the world. But i think that Japanese Ikigai is also smart since it gets the "passion" from the "mission", since the mission is also the engine, so you have to align the passion with the mission of the country or the global world so as to be efficient. And Japanese Ikigai is also smart since, to raise productivity and be efficient, you have to "specialize" in doing a job, but to raise productivity more and be more efficient you can also specialize in what you do "better", and it is what Japanese Ikigai does, since i think that in Japanese Ikigai, the passion makes you specialized in a job in what you do better, and here is what i have just smartly said about productivity:
 
I think i am highly smart, and i have passed two certified IQ tests and i have scored above 115 IQ, and i will now talk about another important idea of Adam Smith, the father of economic Liberalism, and it is about "specialization" in an economic system. I say that in an economic system we have to be specialized in doing a job so as to be efficient and productive, but not only that: we have to specialize in doing a job in what we do better so as to be even more efficient and productive, and we have to minimize as much as possible the idle time or the wasted time in doing a job, since i can also say that this average idle or wasted time of workers working in parallel can be viewed as a contention like in parallel programming, so you have to minimize it as much as possible, and you have to minimize as much as possible the coherency cost, like in parallel programming, so as to scale much better, and of course all this can create an economy of scale. And i also invite you to read my following smart and interesting thoughts about scalability of productivity:
 
I will talk about the following thoughts from the following PhD computer scientist:
 
https://lemire.me/blog/about-me/
 
Read more carefully here his thoughts about productivity:
 
https://lemire.me/blog/2012/10/15/you-cannot-scale-creativity/
 
And i think he is making a mistake in his above webpage about productivity:
 
Since we have that Productivity = Output/Input
 
But better human training and/or better tools and/or better human smartness and/or better human capacity can make the Parallel productivity part much bigger than the Serial productivity part, so it can scale much more (it is like Gustafson's Law).
 
And it looks like the following:
 
About parallelism and about Gustafson's Law..
 
Gustafson's Law:
 
• If you increase the amount of work done by each parallel
task then the serial component will not dominate
• Increase the problem size to maintain scaling
• Can do this by adding extra complexity or increasing the overall
problem size
 
Scaling is important, as the more a code scales the larger a machine it
can take advantage of:
 
• can consider weak and strong scaling
• in practice, overheads limit the scalability of real parallel programs
• Amdahl's law models these in terms of serial and parallel fractions
• larger problems generally scale better: Gustafson's law
 
 
Load balance is also a crucial factor.
 
 
More of my philosophy about HP and about the Tandem team and more of my thoughts..
 
 
I invite you to read the following interesting article so that
you notice how HP was smart by also acquiring Tandem Computers, Inc.
with their "NonStop" systems and by learning from the Tandem team
that also extended HP NonStop to the x86 Server Platform. You can read about it in my writing below, and you can read about Tandem Computers here: https://en.wikipedia.org/wiki/Tandem_Computers , so notice that Tandem Computers, Inc. was the dominant manufacturer of fault-tolerant computer systems for ATM networks, banks, stock exchanges, telephone switching centers, and other similar commercial transaction processing applications requiring maximum uptime and zero data loss:
 
https://www.zdnet.com/article/tandem-returns-to-its-hp-roots/
 
More of my philosophy about HP "NonStop" to x86 Server Platform fault-tolerant computer systems and more..
 
Now HP to Extend HP NonStop to x86 Server Platform
 
HP announced in 2013 plans to extend its mission-critical HP NonStop technology to x86 server architecture, providing the 24/7 availability required in an always-on, globally connected world, and increasing customer choice.
 
Read the following to notice it:
 
https://www8.hp.com/us/en/hp-news/press-release.html?id=1519347#.YHSXT-hKiM8
 
And today HP provides HP NonStop to x86 Server Platform, and here is
an example, read here:
 
https://www.hpe.com/ca/en/pdfViewer.html?docId=4aa5-7443&parentPage=/ca/en/products/servers/mission-critical-servers/integrity-nonstop-systems&resourceTitle=HPE+NonStop+X+NS7+%E2%80%93+Redefining+continuous+availability+and+scalability+for+x86+data+sheet
 
So i think that programming the HP NonStop on x86 is now compatible with standard x86 programming.
 
And i invite you to read my thoughts about technology here:
 
https://groups.google.com/g/soc.culture.usa/c/N_UxX3OECX4
 
Amine Moulay Ramdane <aminer68@gmail.com>: Sep 20 06:15PM -0700

Hello,
 
 
 
 
 
More of my philosophy about stack allocation and more of my thoughts..
 
 
I think i am highly smart since I have passed two certified IQ tests and i have scored "above" 115 IQ, so i have just looked at the x64 assembler
of the C/C++ _alloca function that allocates size bytes of space from the stack. It uses x64 assembler instructions to move the RSP register, and i think that it also aligns the address and ensures that it doesn't go beyond the stack limit etc., and i have quickly understood its x64 assembler, and i invite you to look at it here:
 
64-bit _alloca. How to use from FPC and Delphi?
 
https://www.atelierweb.com/64-bit-_alloca-how-to-use-from-delphi/
 
 
But i think i am smart and i say that the benefit of using a stack comes mostly from the "reusability" of the stack, i mean it is done this way
since from a thread you for example execute other functions or procedures and then exit from those functions or procedures, and this exiting makes the memory of the stack available again for "reusability". It is why i think that using a dynamically allocated array as a stack is also useful, since it also offers those benefits of reusability of the stack, and i think that the dynamic allocation of the array will not be expensive, so it is why i think i will
Amine Moulay Ramdane <aminer68@gmail.com>: Sep 20 04:05PM -0700

Hello,
 
 
 
 
 
More of my philosophy about the polynomial-time complexity of race detection and more of my thoughts..
 
I am a white arab from Morocco, and i think i am smart since i have also
invented many scalable algorithms and algorithms..
 
 
I think i am highly smart since I have passed two certified IQ tests and i have scored "above" 115 IQ, so i have quickly understood how Rust
detects race conditions, and i think that a slew of
"partial order"-based methods have been proposed whose
goal is to predict data races in polynomial time, but at the
cost of being incomplete and failing to detect data races in
"some" traces. These include algorithms based on the classical
happens-before partial order, and those based
on newer partial orders that improve the prediction of data
races over happens-before. So i think that we have to be optimistic,
and i invite you to read the following web page about the Sanitizers:
 
https://github.com/google/sanitizers
 
And notice carefully the ThreadSanitizer, so read carefully
the following paper about ThreadSanitizer:
 
https://static.googleusercontent.com/media/research.google.com/en//pubs/archive/35604.pdf
 
 
And it says in the conclusion the following:
 
"ThreadSanitizer uses a new algorithm; it has several modes of operation, ranging from the most conservative mode (which has few false positives but also misses real races) to a very aggressive one (which
has more false positives but detects the largest number of
real races)."
 
So as you are noticing, since even the very aggressive mode doesn't detect
all the data races, it misses only a really small number of real races, so it gives a very high probability of detecting real races,
and i think that you can also use my below methodology of incrementally deriving a model from the source code and checking it with the Spin model checker so as to raise the probability of detecting real races even higher.
 
 
Read my previous thoughts:
 
More of my philosophy about race conditions and about composability and more of my thoughts..
 
I say that a model is a representation of something. It captures not all attributes of the represented thing, but only those that seem relevant. So my way of working in software development in Delphi and Freepascal is also to derive a "model" from the source code and run it through the Spin model checker so as to detect race conditions, so i invite you to take a look at the following tutorial that uses the powerful Spin tool:
 
https://mirrors.edge.kernel.org/pub/linux/kernel/people/paulmck/perfbook/perfbook.html
 
So you can for example install the Spin model checker to detect race conditions; this is how you will get much more professional at detecting deadlocks and race conditions in parallel programming. And i invite you to look at the following video to learn how to install the Spin model checker on Windows:
 
https://www.youtube.com/watch?v=MGzmtWi4Oq0
 
More of my philosophy about race detection and concurrency and more..
 
I have just looked quickly at different race detectors, and i think that
the Intel Thread Checker from Intel in the "USA" is also very good, since the Intel Thread Checker instruments either the C++ source code or the compiled binary to make every memory reference and every standard Win32 synchronization primitive observable. This instrumentation from the source code is very good since it also permits me to port my scalable algorithm inventions by, for example, wrapping them in some native Windows synchronization APIs, and this instrumentation from the source code is also business friendly, so read about different race detectors and about the Intel Thread Checker here:
 
https://docs.microsoft.com/en-us/archive/msdn-magazine/2008/june/tools-and-techniques-to-identify-concurrency-issues
 
So i think that the race detectors of other programming languages have to provide this instrumentation from the source code as the Intel Thread Checker does.
 
More of my philosophy about Rust and about memory models and about technology and more of my thoughts..
 
 
I think i am highly smart, and i say that the new programming language that we call Rust has an important problem, since the following interesting article says that atomic operations with incorrect memory ordering can still cause race conditions in safe code; this is why the suggestion made by the researchers is:
 
"Race detection techniques are needed for Rust, and they should focus on unsafe code and atomic operations in safe code."
 
 
Read more here:
 
https://www.i-programmer.info/news/98-languages/12552-is-rust-really-safe.html
 
 
More of my philosophy about programming languages about lock-based systems and more..
 
I think we have to be optimistic about lock-based systems, since race conditions detection can be done in polynomial-time, and you can notice it by reading the following paper:
 
https://arxiv.org/pdf/1901.08857.pdf
 
Or by reading the following paper:
 
https://books.google.ca/books?id=f5BXl6nRgAkC&pg=PA421&lpg=PA421&dq=race+condition+detection+and+polynomial+complexity&source=bl&ots=IvxkORGkQ9&sig=ACfU3U2x0fDnNLHP1Cjk5bD_fdJkmjZQsQ&hl=en&sa=X&ved=2ahUKEwjKoNvg0MP0AhWioXIEHRQsDJc4ChDoAXoECAwQAw#v=onepage&q=race%20condition%20detection%20and%20polynomial%20complexity&f=false
 
So i think we can continue to program in lock-based systems, and about
the composability of lock-based systems, read my following thoughts about it:
 
More of my philosophy about composability and about Haskell functional language and more..
 
I have just read quickly the following article about composability,
so i invite you to read it carefully:
 
https://bartoszmilewski.com/2014/06/09/the-functional-revolution-in-c/
 
I am not in accordance with the above article, and i think that the above scientist programs in the Haskell functional language and for him it is the way to composability, since he says that functional programming like Haskell is the
way that allows composability in the presence of concurrency, but that lock-based systems don't allow it. I don't agree with him, and i will give you the logical proof of it; here it is: read what an article from ACM, written by Bryan M. Cantrill and Jeff Bonwick from Sun Microsystems, says:
 
You can read about Bryan M. Cantrill here:
 
https://en.wikipedia.org/wiki/Bryan_Cantrill
 
And you can read about Jeff Bonwick here:
 
https://en.wikipedia.org/wiki/Jeff_Bonwick
 
And here is what the article says about the composability of lock-based systems in the presence of concurrency:
 
"Design your systems to be composable. Among the more galling claims of the detractors of lock-based systems is the notion that they are somehow uncomposable:
 
"Locks and condition variables do not support modular programming," reads one typically brazen claim, "building large programs by gluing together smaller programs[:] locks make this impossible."9 The claim, of course, is incorrect. For evidence one need only point at the composition of lock-based systems such as databases and operating systems into larger systems that remain entirely unaware of lower-level locking.
 
There are two ways to make lock-based systems completely composable, and each has its own place. First (and most obviously), one can make locking entirely internal to the subsystem. For example, in concurrent operating systems, control never returns to user level with in-kernel locks held; the locks used to implement the system itself are entirely behind the system call interface that constitutes the interface to the system. More generally, this model can work whenever a crisp interface exists between software components: as long as control flow is never returned to the caller with locks held, the subsystem will remain composable.
 
Second (and perhaps counterintuitively), one can achieve concurrency and
composability by having no locks whatsoever. In this case, there must be
no global subsystem state—subsystem state must be captured in per-instance state, and it must be up to consumers of the subsystem to assure that they do not access their instance in parallel. By leaving locking up to the client of the subsystem, the subsystem itself can be used concurrently by different subsystems and in different contexts. A concrete example of this is the AVL tree implementation used extensively in the Solaris kernel. As with any balanced binary tree, the implementation is sufficiently complex to merit componentization, but by not having any global state, the implementation may be used concurrently by disjoint subsystems—the only constraint is that manipulation of a single AVL tree instance must be serialized."
 
Read more here:
 
https://queue.acm.org/detail.cfm?id=1454462
 
More of my philosophy about HP and about the Tandem team and more of my thoughts..
 
 
I invite you to read the following interesting article to notice
how HP was smart by also acquiring Tandem Computers, Inc.
with their "NonStop" systems and by learning from the Tandem team
that has also extended HP NonStop to the x86 server platform; you can read about it in my writing below, and you can read about Tandem Computers here: https://en.wikipedia.org/wiki/Tandem_Computers . So notice that Tandem Computers, Inc. was the dominant manufacturer of fault-tolerant computer systems for ATM networks, banks, stock exchanges, telephone switching centers, and other similar commercial transaction processing applications requiring maximum uptime and zero data loss:
 
https://www.zdnet.com/article/tandem-returns-to-its-hp-roots/
 
More of my philosophy about HP "NonStop" to x86 Server Platform fault-tolerant computer systems and more..
 
Now HP to Extend HP NonStop to x86 Server Platform
 
HP announced in 2013 plans to extend its mission-critical HP NonStop technology to x86 server architecture, providing the 24/7 availability required in an always-on, globally connected world, and increasing customer choice.
 
Read the following to notice it:
 
https://www8.hp.com/us/en/hp-news/press-release.html?id=1519347#.YHSXT-hKiM8
 
And today HP provides HP NonStop to x86 Server Platform, and here is
an example, read here:
 
https://www.hpe.com/ca/en/pdfViewer.html?docId=4aa5-7443&parentPage=/ca/en/products/servers/mission-critical-servers/integrity-nonstop-systems&resourceTitle=HPE+NonStop+X+NS7+%E2%80%93+Redefining+continuous+availability+and+scalability+for+x86+data+sheet
 
So i think programming the HP NonStop for x86 is now compatible with x86 programming.
 
And i invite you to read my thoughts about technology here:
 
https://groups.google.com/g/soc.culture.usa/c/N_UxX3OECX4
 
 
More of my philosophy about stack allocation and more of my thoughts..
 
 
I think i am highly smart since I have passed two certified IQ tests and i have scored "above" 115 IQ, so i have just looked at the x64 assembler
of the C/C++ _alloca function, which allocates the requested number of bytes from the stack: it moves the RSP register, aligns the resulting address, and ensures that it doesn't go beyond the stack limit, and i have quickly understood its x64 assembler, so i invite you to look at it here:
 
64-bit _alloca. How to use from FPC and Delphi?
 
https://www.atelierweb.com/64-bit-_alloca-how-to-use-from-delphi/
 
 
But i think i am smart and i say that the benefit of using a stack comes mostly from the "reusability" of its memory: a thread calls functions or procedures, and returning from them makes their stack memory available again for reuse. That is why i think that a dynamically allocated array used as a stack is also useful, since it offers the same reusability benefits, and i think that the dynamic allocation of the array will not be expensive, so i think i will implement an _alloca-like function using a dynamically allocated array, and i think it will also be good for my sophisticated coroutines library, which you can read about in my following thoughts about preemptive and non-preemptive timesharing at the following web link:
 
 
https://groups.google.com/g/alt.culture.morocco/c/JuC4jar661w
 
 
And i invite you to read my thoughts about technology here:
 
https://groups.google.com/g/soc.culture.usa/c/N_UxX3OECX4
 
 
 
Thank you,
Amine Moulay Ramdane.
Amine Moulay Ramdane <aminer68@gmail.com>: Sep 20 02:25PM -0700

Hello,
 
 
 
More of my philosophy about race conditions and about composability and more of my thoughts..
 
I am a white arab from Morocco, and i think i am smart since i have also
invented many scalable algorithms and algorithms..
 
 
 
 
 
 
Thank you,
Amine Moulay Ramdane.
Amine Moulay Ramdane <aminer68@gmail.com>: Sep 20 11:10AM -0700

Hello,
 
 
 
Yet more precision about more of my philosophy about specialization in Economic integration and about productivity and quality and more of my thoughts..
 
I am a white arab, and i think i am smart since i have also
invented many scalable algorithms and algorithms..
 
 
 
I think i am highly smart since I have passed two certified IQ tests and i have scored "above" 115 IQ, and here is what i have just said about
Economic integration and research & development (R&D) investment:
 
"Economic integration, or regional integration, is an agreement among nations to reduce or eliminate trade barriers and agree on fiscal policies. The European Union, for example, represents a complete economic integration. Strict nationalists may oppose economic integration due to concerns over a loss of sovereignty. So i say that we have not just to look at economic integration, but also at how much is spent on research & development (R&D), and i think that African leaders committed to investing 1% of GDP in research and development. With the African continent's GDP currently estimated at US$2.7 trillion, investment in science and technology is meant to be at US$27 billion, but i think that Africa is very much behind even China in research & development (R&D) investment, since China spends US$514.798 billion on research & development (R&D), the USA spends US$612.714 billion, the European Union spends around US$413 billion, and India spends US$158.691 billion. But i think that the problem of Africa is that it is "much" less economically integrated than the European Union, so then we have also to look at the economic integration of Arab countries and South American countries and such, since economic integration is so important."
 
 
And to be more precise, i say that economic integration also benefits from the specialization that the philosopher and economist Adam Smith, the father of economic Liberalism, talked about, since when in an economy or in a job you specialize in what you do best, it tends to raise both productivity and quality, and of course when you raise productivity and quality it also tends to create more jobs. It is the same for artificial intelligence and automation: they also raise productivity and quality, so they create more jobs. And of course you have to think of capitalism as not a zero-sum game, and capitalism is not a zero-sum game since, for example, more wealth is always being produced, since we take natural resources and transform them into the goods that we want. We grow more food, weave cloth, build homes. We produce a larger supply of goods, and new goods which we never had before. Production increases the quantity of real, physical wealth. So i can say that economic integration is beneficial in many ways, as it allows countries to specialize and trade without government interference, which can benefit all economies. It results in a reduction of costs and ultimately an increase in overall wealth, and economic integration can facilitate access to a larger consumer base, a greater pool of qualified workers, additional sources of financing, and new technologies.
 
And i have also just talked more about the very important thing that we call specialization, and here is what i have just said about it:
 
More of my philosophy about education and specialization and more of my thoughts..
 
I think i am highly smart since I have passed two certified IQ tests and i have scored "above" 115 IQ, so as you notice, i have not scored
115 IQ, but "above" 115 IQ, and now i will give you an IQ test by
inviting you to look at the following video, since i have just looked
at it:
 
 
Why FREE college HURTS the poor - VisualEconomik EN
 
https://www.youtube.com/watch?v=vZvxnk1boAE
 
 
So i am discovering smart patterns with my fluid intelligence in
the above video, and it is that the person talking in the above video is not seeing with his fluid intelligence, as i am seeing with mine, that school is not specialized for the low IQs that are around 25% of people; i mean that schools and their courses are tuned for the average IQ, and to a certain level for the high IQ, but not for the low IQs that are around 25%. So to solve the problem we have to also "specialize" schools for the low IQs, meaning schools that teach the low IQs with ease, and it is also valid for the free schools, and it is the way i have just spoken about IQ and specialization by saying the following:
 
 
More of my philosophy about IQ tests and more of my thoughts..
 
 
I think i am highly smart, and I have passed two certified IQ tests and i have scored above 115 IQ, but i have just passed more and more IQ tests, and i have just noticed that the manner in which we test with IQ tests is not correct, since in an IQ test you can be more specialized and above average in one subtest of intelligence than in another, since IQ tests test for many kinds of intelligence such as spatial, speed, calculation and logical intelligence etc., so i think that you can be really above average in logical intelligence, as i am, but at the same time below average in calculation and/or spatial intelligence. Since an IQ test doesn't test for this kind of specialization of intelligence, i think it is not good, since testing for this kind of specialization in intelligence is really important so as to be efficient by knowing the strong advantages of this or that person in every type of intelligence. And about the importance of specialization, read carefully my following thought about it:
 
 
More of my philosophy about specialization and about efficiency and productivity..
 
The previous CEO of General Electric, Larry Culp, was the architect of a strategy that represented a new turning point in world corporate strategies. Larry Culp's strategy was to divide the company according to its activities, something like: we are better off alone, separately and
focused on each one's own activity, than together in a large
conglomerate. And it is a move from integration to specialization.
You see, it is thought that a company always gains economies of scale
as it grows, but this is not necessarily the case, since as the company
gains in size - especially if it engages in many activities - it
also generates its own bureaucracy, with all that entails in terms
of cost and efficiency. And not only that, it is also often the case
that by bringing together very different activities, strategic focus is lost and decision-making is diluted, so that in the end no one ends up
taking responsibility. It doesn't always happen, but these reasons are
basically what is driving this increasing specialization. So i invite you to look at the following video so as to understand more about it:
 
The decline of industrial icon of the US - VisualPolitik EN
 
https://www.youtube.com/watch?v=-hqwYxFCY-k
 
 
And here is my previous thoughts about specialization and productivity so that to understand much more:
 
More about the Japanese Ikigai and about productivity and more of my thoughts..
 
Read the following interesting article about Japanese Ikigai:
 
The More People With Purpose, the Better the World Will Be
 
https://singularityhub.com/2018/06/15/the-more-people-with-purpose-the-better-the-world-will-be/
 
I think i am highly smart, so i say that the Japanese Ikigai is like a Japanese philosophy that is the right combination or "balance" of passion, vocation, and mission, and Ikigai and MTP, as concepts, urge us to align our passions with a mission to better the world. But i think that Japanese Ikigai is also smart since it gets the "passion" from the "mission", since the mission is also the engine, so you have to align the passion with the mission of the country or the global world so as to be efficient. And Japanese Ikigai is also smart since, so as to raise productivity and be efficient, you have to "specialize" in doing a job, but so as to raise productivity further and be even more efficient you can also specialize in what you do "better", and it is what Japanese Ikigai does, since i think that in Japanese Ikigai, the passion makes you specialized in a job in what you do better, and here is what i have just smartly said about productivity:
 
I think i am highly smart, and i have passed two certified IQ tests and i have scored above 115 IQ, and i will now talk about another important idea of Adam Smith, the father of economic Liberalism, and it is about "specialization" in an economic system. I say that in an economic system we have to be specialized in doing a job so as to be efficient and productive, but not only that: we have to specialize in doing the job that we do better so as to be even more efficient and productive, and we have to minimize at best the idle time or wasted time in doing a job, since i can also say that this average idle or wasted time of the workers working in parallel can be viewed as contention, like in parallel programming, so you have to minimize it at best, and you also have to minimize at best the coherency traffic, like in parallel programming, so as to scale much better, and of course all this can create an economy of scale. Also, i invite you to read my following smart and interesting thoughts about the scalability of productivity:
 
I will talk about the following thoughts from the following PhD computer scientist:
 
https://lemire.me/blog/about-me/
 
Read more carefully here his thoughts about productivity:
 
https://lemire.me/blog/2012/10/15/you-cannot-scale-creativity/
 
And i think he is making a mistake in his above webpage about productivity:
 
Since we have that Productivity = Output/Input
 
But better human training and/or better tools and/or better human smartness and/or better human capacity can make the parallel productivity part much bigger than the serial productivity part, so it can scale much more (it is like Gustafson's Law).
 
And it looks like the following:
 
About parallelism and about Gustafson's Law..
 
Gustafson's Law:
 
• If you increase the amount of work done by each parallel
task then the serial component will not dominate
• Increase the problem size to maintain scaling
• Can do this by adding extra complexity or increasing the overall
problem size
 
Scaling is important, as the more a code scales, the larger the machine it
can take advantage of:
 
• can consider weak and strong scaling
• in practice, overheads limit the scalability of real parallel programs
• Amdahl's law models these in terms of serial and parallel fractions
• larger problems generally scale better: Gustafson's law
 
 
Load balance is also a crucial factor.
 
 
More of my philosophy about HP and about the Tandem team and more of my thoughts..
 
 
I invite you to read the following interesting article so as
to notice how HP was smart by also acquiring Tandem Computers, Inc.
with their "NonStop" systems and by learning from the Tandem team
that has also extended HP NonStop to the x86 server platform, and you can read about it in my below writing and you can read about Tandem Computers here: https://en.wikipedia.org/wiki/Tandem_Computers , so notice that Tandem Computers, Inc. was the dominant manufacturer of fault-tolerant computer systems for ATM networks, banks, stock exchanges, telephone switching centers, and other similar commercial transaction processing applications requiring maximum uptime and zero data loss:
 
https://www.zdnet.com/article/tandem-returns-to-its-hp-roots/
 
More of my philosophy about HP "NonStop" to x86 Server Platform fault-tolerant computer systems and more..
 
Now HP to Extend HP NonStop to x86 Server Platform
 
HP announced in 2013 plans to extend its mission-critical HP NonStop technology to x86 server architecture, providing the 24/7 availability required in an always-on, globally connected world, and increasing customer choice.
 
Read the following to notice it:
 
https://www8.hp.com/us/en/hp-news/press-release.html?id=1519347#.YHSXT-hKiM8
 
And today HP provides HP NonStop to x86 Server Platform, and here is
an example, read here:
 
https://www.hpe.com/ca/en/pdfViewer.html?docId=4aa5-7443&parentPage=/ca/en/products/servers/mission-critical-servers/integrity-nonstop-systems&resourceTitle=HPE+NonStop+X+NS7+%E2%80%93+Redefining+continuous+availability+and+scalability+for+x86+data+sheet
 
So i think programming the HP NonStop for x86 is now compatible with x86 programming.
 
And i invite you to read my thoughts about technology here:
 
https://groups.google.com/g/soc.culture.usa/c/N_UxX3OECX4
 
 
More of my philosophy about stack allocation and more of my thoughts..
 
 
I think i am highly smart since I have passed two certified IQ tests and i have scored "above" 115 IQ, so i have just looked at the x64 assembler
of the C/C++ _alloca function that allocates size bytes of space from the stack, and it uses x64 assembler instructions to move the RSP register, and i think that it also aligns the address and ensures that it doesn't go beyond the stack limit etc., and i have quickly understood its x64 assembler, and i invite you to look at it here:
 
64-bit _alloca. How to use from FPC and Delphi?
 
https://www.atelierweb.com/64-bit-_alloca-how-to-use-from-delphi/
 
 
But i think i am smart and i say that the benefit of using a stack comes mostly from the "reusability" of the stack, i mean it is done this way
since you have for example, from a thread, to call other functions or procedures and to return from those functions or procedures, and this returning from those functions or procedures makes the memory of the stack available again for "reusability", and it is why i think that using a dynamically allocated array as a stack is also useful, since it also offers those benefits of reusability of the stack, and i think that the dynamic allocation of the array will not be expensive, so it is why i think i will implement the _alloca function using a dynamically allocated array, and i think it will also be good for my sophisticated coroutines library.
 
 
More of my philosophy about the simulation hypothesis and more of my thoughts..
 
 
I think i am highly smart since I have passed two certified IQ tests and i have scored "above" 115 IQ, and i invite you to look at the following
video that talks about the simulation hypothesis:
 
Michio Kaku - Is the Universe A Digital Simulation
 
https://www.youtube.com/watch?v=Eb7uZXZHVk8
 
 
So i think i am highly smart, and i think that saying that the Universe is a digital simulation is much less probable, since i say: look for example at how this thing that we call humanity looks, so we can not
say that the simulation of these super intelligent beings has started a very long time ago, like for example more than a billion years ago, since this way of doing creates too much suffering, which is much less probable since the super intelligent beings are so advanced and so smart, but i think it is much shorter, so then we notice that the super intelligent beings that have started the simulation can not start the simulation in so primitive an environment as our today's humanity, so then i can say that saying that the Universe is a digital simulation is much less probable, and the fact that reality exists is much more probable.
 
 
More of my philosophy about the essence of science and technology and more of my thoughts..
 
 
 
I think i am highly smart since I have passed two certified IQ tests and i have scored above 115 IQ, so i will ask the following philosophical question:
 
What is the essence of science and what is