- More of my philosophy about disorder and about God and more.. - 1 Update
- More of my philosophy about the nature of our universe and the other universes.. - 1 Update
- More of my philosophy about the distributed intelligence and more.. - 1 Update
- More of my philosophy about what is time as we know it.. - 1 Update
- More of my philosophy and precision about the link of the article and more.. - 1 Update
- More of my philosophy about coroutines and about setjmp() and longjmp().. - 1 Update
- More of my philosophy about setjmp() and longjmp() and generators and coroutines.. - 1 Update
- More of my philosophy of how the wise man can win the fight.. - 1 Update
- AMD unveils EPYC Milan-X processors with up to 768MB of L3 cache per socket - 1 Update
- What impact is a developer's dissatisfaction likely to have on their productivity? - 1 Update
- Are the Nordic countries really less innovative than the US? - 1 Update
| Amine Moulay Ramdane <aminer68@gmail.com>: Nov 23 07:02PM -0800
Hello,

More of my philosophy about disorder and about God and more..

I am a white Arab from Morocco, and I think I am smart since I have also invented many scalable algorithms and other algorithms..

I think I am smart, and a naive person will quickly say that God cannot come from "disorder", such as the disorder from which our universe has come. But I say that disorder is not a sufficient reason to conclude that God does not exist, since in Arabic we can say: "Hal min la' wujud ya'tina' al wujud", which translates into English as: "Can the existent come from the non-existent?". So notice that we can, for example, say that the many universes other than our own have always existed, and this lets us say that something other than those universes can also have always existed, namely the one that we call God.

More of my philosophy about the nature of our universe and the other universes..

I think I am smart, and I think that the nature of our universe is that the lowest subatomic layer is also made of a kind of "diversification", as in an ant colony, and I think that this diversification comes from a kind of disorder, from an evolutive composition of the other universes that are disorder. But there is a special thing to notice: the individual subatomic constituents of time seem to be made of disorder, yet the composition of the whole subatomic layer gives an emergence of order or intelligence, such as time as we know it, with its characteristic of being relative.

More of my philosophy about the distributed intelligence and more..

I think I am smart, and I think the intelligence of an ant colony emerges from a "distributed" intelligence. An individual ant can be specialized, as in our kind of civilization, but even though a specialized ant can be viewed as much less capable of constructing the intelligence of the colony by itself, the distributed intelligence of the colony makes that intelligence emerge. I think time is the same: the individual subatomic constituents of time can be more specialized, but the whole subatomic layer can give emergence to time as we know it and to the relativity of time.

More of my philosophy about what is time as we know it..

I think I am smart, and I think you know me much better now, so I will explain my point of view of what time as we know it is: I think that time, which is relative as Albert Einstein said, is an "emergence", so it is like an ant colony. If you look at an individual ant in its colony, you will say that it looks like disorder that cannot construct what the colony constructs, yet the colony is an "emergence" of intelligence; it is like the human stomach, which is also an emergence of a kind of intelligence that knows how to process food. So I think that time, which is relative as Einstein said, is also an "emergence": when you look at time individually, from the subatomic point of view, you will say that it is a disorder that cannot construct time as we know it, by logical analogy with the example of the individual ant above, but I think that the whole subatomic layer has made time as we know it "emerge", just like the emergence of the intelligence of an ant colony.

Thank you,
Amine Moulay Ramdane. |
| Amine Moulay Ramdane <aminer68@gmail.com>: Nov 23 06:38PM -0800
Hello,

More of my philosophy about the nature of our universe and the other universes..

I am a white Arab from Morocco, and I think I am smart since I have also invented many scalable algorithms and other algorithms..

I think I am smart, and I think that the nature of our universe is that the lowest subatomic layer is also made of a kind of "diversification", as in an ant colony, and I think that this diversification comes from a kind of disorder, from an evolutive composition of the other universes that are disorder. But there is a special thing to notice: the individual subatomic constituents of time seem to be made of disorder, yet the composition of the whole subatomic layer gives an emergence of order or intelligence, such as time as we know it, with its characteristic of being relative.

More of my philosophy about the distributed intelligence and more..

I think I am smart, and I think the intelligence of an ant colony emerges from a "distributed" intelligence. An individual ant can be specialized, as in our kind of civilization, but even though a specialized ant can be viewed as much less capable of constructing the intelligence of the colony by itself, the distributed intelligence of the colony makes that intelligence emerge. I think time is the same: the individual subatomic constituents of time can be more specialized, but the whole subatomic layer can give emergence to time as we know it and to the relativity of time.

More of my philosophy about what is time as we know it..

I think I am smart, and I think you know me much better now, so I will explain my point of view of what time as we know it is: I think that time, which is relative as Albert Einstein said, is an "emergence", so it is like an ant colony. If you look at an individual ant in its colony, you will say that it looks like disorder that cannot construct what the colony constructs, yet the colony is an "emergence" of intelligence; it is like the human stomach, which is also an emergence of a kind of intelligence that knows how to process food. So I think that time, which is relative as Einstein said, is also an "emergence": when you look at time individually, from the subatomic point of view, you will say that it is a disorder that cannot construct time as we know it, by logical analogy with the example of the individual ant above, but I think that the whole subatomic layer has made time as we know it "emerge", just like the emergence of the intelligence of an ant colony.

Thank you,
Amine Moulay Ramdane. |
| Amine Moulay Ramdane <aminer68@gmail.com>: Nov 23 06:11PM -0800
Hello,

More of my philosophy about the distributed intelligence and more..

I am a white Arab from Morocco, and I think I am smart since I have also invented many scalable algorithms and other algorithms..

I think I am smart, and I think the intelligence of an ant colony emerges from a "distributed" intelligence. An individual ant can be specialized, as in our kind of civilization, but even though a specialized ant can be viewed as much less capable of constructing the intelligence of the colony by itself, the distributed intelligence of the colony makes that intelligence emerge. I think time is the same: the individual subatomic constituents of time can be more specialized, but the whole subatomic layer can give emergence to time as we know it and to the relativity of time.

More of my philosophy about what is time as we know it..

I think I am smart, and I think you know me much better now, so I will explain my point of view of what time as we know it is: I think that time, which is relative as Albert Einstein said, is an "emergence", so it is like an ant colony. If you look at an individual ant in its colony, you will say that it looks like disorder that cannot construct what the colony constructs, yet the colony is an "emergence" of intelligence; it is like the human stomach, which is also an emergence of a kind of intelligence that knows how to process food. So I think that time, which is relative as Einstein said, is also an "emergence": when you look at time individually, from the subatomic point of view, you will say that it is a disorder that cannot construct time as we know it, by logical analogy with the example of the individual ant above, but I think that the whole subatomic layer has made time as we know it "emerge", just like the emergence of the intelligence of an ant colony.

Thank you,
Amine Moulay Ramdane. |
| Amine Moulay Ramdane <aminer68@gmail.com>: Nov 23 05:47PM -0800
Hello,

More of my philosophy about what is time as we know it..

I am a white Arab from Morocco, and I think I am smart since I have also invented many scalable algorithms and other algorithms..

I think I am smart, and I think you know me much better now, so I will explain my point of view of what time as we know it is: I think that time, which is relative as Albert Einstein said, is an "emergence", so it is like an ant colony. If you look at an individual ant in its colony, you will say that it looks like disorder that cannot construct what the colony constructs, yet the colony is an "emergence" of intelligence; it is like the human stomach, which is also an emergence of a kind of intelligence that knows how to process food. So I think that time, which is relative as Einstein said, is also an "emergence": when you look at time individually, from the subatomic point of view, you will say that it is a disorder that cannot construct time as we know it, by logical analogy with the example of the individual ant above, but I think that the whole subatomic layer has made time as we know it "emerge", just like the emergence of the intelligence of an ant colony.

Thank you,
Amine Moulay Ramdane. |
| Amine Moulay Ramdane <aminer68@gmail.com>: Nov 23 01:10PM -0800
Hello,

More of my philosophy and precision about the link of the article and more..

I am a white Arab from Morocco, and I think I am smart since I have also invented many scalable algorithms and other algorithms..

And notice that the link below, to the article that shows the problem of implementing coroutines with just setjmp() and longjmp(), is from the course called "CS4411 Operating Systems" at Michigan Technological University (taken in the last semester of the second year), but I think I am smart and those courses are easy for me. I invite you to read about this course, which requires both "CS3331 Concurrent Computing" and "CS3421 Computer Organization", and here it is:

http://www.csl.mtu.edu/cs4411.ck/www/Home.html

More of my philosophy about coroutines and about setjmp() and longjmp()..

I think I am smart, and I will say that with setjmp() and longjmp() you can implement a generator or the like, but you cannot implement coroutines with just setjmp() and longjmp(). To understand why, I invite you to read the following article, which shows how, when you yield from a first function with longjmp() back to the main body of the program and then jump into another function with longjmp(), the stack frames can stop working correctly. Once you understand this, you will not use setjmp() and longjmp() alone to implement coroutines, so read the following article to understand the problem with the stack frames; I understand it easily:

https://www.csl.mtu.edu/cs4411.ck/www/NOTES/non-local-goto/coroutine.html

So this is why I have also implemented my sophisticated stackful coroutines library that solves this problem, and here is my sophisticated coroutines library; read about it and download it from here:

https://sites.google.com/site/scalable68/object-oriented-stackful-coroutines-library-for-delphi-and-freepascal

More of my philosophy about setjmp() and longjmp() and generators and coroutines..

I have just quickly implemented setjmp() and longjmp() in x64 assembler, and after that I have quickly implemented a good example of a generator with my setjmp() and longjmp(); look at it below. In computer science, a generator is a routine that can be used to control the iteration behaviour of a loop. All generators are also iterators. A generator is very similar to a function that returns an array, in that a generator has parameters, can be called, and generates a sequence of values. However, instead of building an array containing all the values and returning them all at once, a generator yields the values one at a time, which requires less memory and allows the caller to start processing the first few values immediately. In short, a generator looks like a function but behaves like an iterator.

So here are my implementations in FreePascal and Delphi, and they are working perfectly.

Here is my first unit, which implements longjmp() and setjmp(); notice how I am saving the non-volatile registers and how I am coding it in x64 assembler:

======

{ Volatile registers: The calling program assumes registers RAX, RCX, RDX,
  and R8 through R11 are volatile. The contents of registers RBX, RSI, RDI,
  RBP, RSP, and R12 through R15 are considered non-volatile. Functions
  return values in RAX. }

unit JmpLib64;

{$IFDEF FPC}
 {$ASMMODE intel}
{$ENDIF}

interface

type
  jmp_buf = record
    RBX, RSI, RDI, RSP, RBP, RIP, R12, R13, R14, R15: UInt64;
  end;

{ setjmp captures the complete task state which can later be used to perform
  a non-local goto using longjmp. setjmp returns 0 when it is initially
  called, and a non-zero value when it is returning from a call to longjmp.
  setjmp must be called before longjmp. }
function setjmp(out jmpb: jmp_buf): UInt64;

{ longjmp restores the task state captured by setjmp (and passed in jmpb).
  It then returns in such a way that setjmp appears to have returned with
  the value retval. setjmp must be called before longjmp. }
procedure longjmp(const jmpb: jmp_buf; retval: UInt64);

implementation

function setjmp(out jmpb: jmp_buf): UInt64; assembler; {$IFDEF FPC} nostackframe; {$ENDIF} register;
asm
  { -> RCX jmpb }
  { <- RAX Result }
  MOV RDX, [RSP]               // Fetch return address (RIP)
  // Save task state
  MOV [RCX+jmp_buf.&RBX], RBX
  MOV [RCX+jmp_buf.&RSI], RSI
  MOV [RCX+jmp_buf.&RDI], RDI
  MOV [RCX+jmp_buf.&RSP], RSP
  MOV [RCX+jmp_buf.&RBP], RBP
  MOV [RCX+jmp_buf.&RIP], RDX
  MOV [RCX+jmp_buf.&R12], R12
  MOV [RCX+jmp_buf.&R13], R13
  MOV [RCX+jmp_buf.&R14], R14
  MOV [RCX+jmp_buf.&R15], R15
  SUB RAX, RAX                 // setjmp returns 0 on the initial call
@@1:
end;

procedure longjmp(const jmpb: jmp_buf; retval: UInt64); assembler; {$IFDEF FPC} nostackframe; {$ENDIF} register;
asm
  { -> RCX jmpb }
  {    RDX retval }
  { <- RAX Result }
  XCHG RDX, RCX
  MOV RAX, RCX
  MOV RCX, [RDX+jmp_buf.&RIP]
  // Restore task state
  MOV RBX, [RDX+jmp_buf.&RBX]
  MOV RSI, [RDX+jmp_buf.&RSI]
  MOV RDI, [RDX+jmp_buf.&RDI]
  MOV RSP, [RDX+jmp_buf.&RSP]
  MOV RBP, [RDX+jmp_buf.&RBP]
  MOV R12, [RDX+jmp_buf.&R12]
  MOV R13, [RDX+jmp_buf.&R13]
  MOV R14, [RDX+jmp_buf.&R14]
  MOV R15, [RDX+jmp_buf.&R15]
  MOV [RSP], RCX               // Restore return address (RIP)
  TEST RAX, RAX                // Ensure retval is <> 0
  JNZ @@1
  MOV RAX, 1
@@1:
end;

end.

================

And here is my example of a generator with my longjmp() and setjmp():

{ In computer science, a generator is a routine that can be used to control
  the iteration behaviour of a loop. All generators are also iterators. A
  generator is very similar to a function that returns an array, in that a
  generator has parameters, can be called, and generates a sequence of
  values. However, instead of building an array containing all the values
  and returning them all at once, a generator yields the values one at a
  time, which requires less memory and allows the caller to start
  processing the first few values immediately. In short, a generator looks
  like a function but behaves like an iterator. }

program test_generator;

{$APPTYPE CONSOLE}

uses
  JmpLib64;

type
  PtrInt = ^Integer;

var
  childtask, maintask: jmp_buf;
  myarr1: array of integer;
  i, a: integer;
  Ptr1: PtrInt;

function generator(var myarr: array of integer): integer;
var
  i1: integer;
  val: integer;
  ptr: PtrInt;
begin
  i1 := 0;
  val := setjmp(childtask);
  if val = 0 then
  begin
    // First activation: yield the first element of the array.
    new(ptr);
    ptr^ := myarr1[i1];
    longjmp(maintask, uint64(ptr));
  end;
  if val = 10 then
  begin
    writeln('Exiting child..');
    exit;
  end;
  // val-1 is the index of the value that was consumed by the main program.
  i1 := val - 1;
  inc(i1);
  new(ptr);
  ptr^ := myarr1[i1];
  longjmp(maintask, uint64(ptr));
end;

begin
  setlength(myarr1, 10);
  for i := 0 to 9 do
    myarr1[i] := i;
  uint64(ptr1) := setjmp(maintask);
  if ptr1 = nil then
    generator(myarr1);
  a := ptr1^;
  dispose(ptr1);
  if (a <= length(myarr1)) then
  begin
    if a = length(myarr1) then
      longjmp(childtask, a + 1)
    else
    begin
      writeln('Value returned by generator is: ', a);
      longjmp(childtask, a + 1);
    end;
  end;
  setlength(myarr1, 0);
end.

====
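And here is a minimal additional sketch of a Fibonacci generator built on the same JmpLib64 unit. It is only a sketch under the same assumptions as the unit (Win64 calling convention, x64); the routine and variable names in it are illustrative and it is not part of the article above or of my coroutines library. Because setjmp() and longjmp() do not protect the suspended routine's stack frame once the main program resumes and calls other routines, the sketch deliberately keeps the generator state in global variables rather than in locals; removing that limitation is exactly what a real stackful coroutines library does by giving each coroutine its own stack.

====

program fib_generator;

{$APPTYPE CONSOLE}

uses
  JmpLib64;

var
  gentask, maintask: jmp_buf;
  fa, fb: UInt64;   // generator state, kept global on purpose (see note above)
  k: integer;       // number of values printed so far
  v: UInt64;        // value handed back by the generator

procedure fibgen;
begin
  fa := 0;
  fb := 1;
  while true do
  begin
    // Yield the current value to the main program. retval must be non-zero,
    // so we pass fa+1 and the main program subtracts 1 again.
    if setjmp(gentask) = 0 then
      longjmp(maintask, fa + 1);
    // Resumed by the main program: advance the Fibonacci sequence.
    fb := fa + fb;
    fa := fb - fa;
  end;
end;

begin
  v := setjmp(maintask);
  if v = 0 then
    fibgen                        // first activation of the generator
  else
  begin
    writeln('fib = ', v - 1);     // undo the +1 used to keep retval non-zero
    inc(k);
    if k < 10 then
      longjmp(gentask, 1);        // resume the generator for the next value
  end;
end.

====

This sketch prints the first ten Fibonacci numbers (0, 1, 1, 2, 3, 5, 8, 13, 21, 34) and then falls through, leaving the suspended fibgen abandoned, which is acceptable for a generator but is precisely what you cannot do safely between two arbitrary routines with setjmp() and longjmp() alone.

Thank you,
Amine Moulay Ramdane. |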
| Amine Moulay Ramdane <aminer68@gmail.com>: Nov 23 12:45PM -0800
Hello,

More of my philosophy about coroutines and about setjmp() and longjmp()..

I am a white Arab from Morocco, and I think I am smart since I have also invented many scalable algorithms and other algorithms..

I think I am smart, and I will say that with setjmp() and longjmp() you can implement a generator or the like, but you cannot implement coroutines with just setjmp() and longjmp(). To understand why, I invite you to read the following article, which shows how, when you yield from a first function with longjmp() back to the main body of the program and then jump into another function with longjmp(), the stack frames can stop working correctly. Once you understand this, you will not use setjmp() and longjmp() alone to implement coroutines, so read the following article to understand the problem with the stack frames; I understand it easily:

https://www.csl.mtu.edu/cs4411.ck/www/NOTES/non-local-goto/coroutine.html

So this is why I have also implemented my sophisticated stackful coroutines library that solves this problem, and here is my sophisticated coroutines library; read about it and download it from here:

https://sites.google.com/site/scalable68/object-oriented-stackful-coroutines-library-for-delphi-and-freepascal

More of my philosophy about setjmp() and longjmp() and generators and coroutines..

I have just quickly implemented setjmp() and longjmp() in x64 assembler, and after that I have quickly implemented a good example of a generator with my setjmp() and longjmp(); look at it below. In computer science, a generator is a routine that can be used to control the iteration behaviour of a loop. All generators are also iterators. A generator is very similar to a function that returns an array, in that a generator has parameters, can be called, and generates a sequence of values. However, instead of building an array containing all the values and returning them all at once, a generator yields the values one at a time, which requires less memory and allows the caller to start processing the first few values immediately. In short, a generator looks like a function but behaves like an iterator.

So here are my implementations in FreePascal and Delphi, and they are working perfectly.

Here is my first unit, which implements longjmp() and setjmp(); notice how I am saving the non-volatile registers and how I am coding it in x64 assembler:

======

{ Volatile registers: The calling program assumes registers RAX, RCX, RDX,
  and R8 through R11 are volatile. The contents of registers RBX, RSI, RDI,
  RBP, RSP, and R12 through R15 are considered non-volatile. Functions
  return values in RAX. }

unit JmpLib64;

{$IFDEF FPC}
 {$ASMMODE intel}
{$ENDIF}

interface

type
  jmp_buf = record
    RBX, RSI, RDI, RSP, RBP, RIP, R12, R13, R14, R15: UInt64;
  end;

{ setjmp captures the complete task state which can later be used to perform
  a non-local goto using longjmp. setjmp returns 0 when it is initially
  called, and a non-zero value when it is returning from a call to longjmp.
  setjmp must be called before longjmp. }
function setjmp(out jmpb: jmp_buf): UInt64;

{ longjmp restores the task state captured by setjmp (and passed in jmpb).
  It then returns in such a way that setjmp appears to have returned with
  the value retval. setjmp must be called before longjmp. }
procedure longjmp(const jmpb: jmp_buf; retval: UInt64);

implementation

function setjmp(out jmpb: jmp_buf): UInt64; assembler; {$IFDEF FPC} nostackframe; {$ENDIF} register;
asm
  { -> RCX jmpb }
  { <- RAX Result }
  MOV RDX, [RSP]               // Fetch return address (RIP)
  // Save task state
  MOV [RCX+jmp_buf.&RBX], RBX
  MOV [RCX+jmp_buf.&RSI], RSI
  MOV [RCX+jmp_buf.&RDI], RDI
  MOV [RCX+jmp_buf.&RSP], RSP
  MOV [RCX+jmp_buf.&RBP], RBP
  MOV [RCX+jmp_buf.&RIP], RDX
  MOV [RCX+jmp_buf.&R12], R12
  MOV [RCX+jmp_buf.&R13], R13
  MOV [RCX+jmp_buf.&R14], R14
  MOV [RCX+jmp_buf.&R15], R15
  SUB RAX, RAX                 // setjmp returns 0 on the initial call
@@1:
end;

procedure longjmp(const jmpb: jmp_buf; retval: UInt64); assembler; {$IFDEF FPC} nostackframe; {$ENDIF} register;
asm
  { -> RCX jmpb }
  {    RDX retval }
  { <- RAX Result }
  XCHG RDX, RCX
  MOV RAX, RCX
  MOV RCX, [RDX+jmp_buf.&RIP]
  // Restore task state
  MOV RBX, [RDX+jmp_buf.&RBX]
  MOV RSI, [RDX+jmp_buf.&RSI]
  MOV RDI, [RDX+jmp_buf.&RDI]
  MOV RSP, [RDX+jmp_buf.&RSP]
  MOV RBP, [RDX+jmp_buf.&RBP]
  MOV R12, [RDX+jmp_buf.&R12]
  MOV R13, [RDX+jmp_buf.&R13]
  MOV R14, [RDX+jmp_buf.&R14]
  MOV R15, [RDX+jmp_buf.&R15]
  MOV [RSP], RCX               // Restore return address (RIP)
  TEST RAX, RAX                // Ensure retval is <> 0
  JNZ @@1
  MOV RAX, 1
@@1:
end;

end.

================

And here is my example of a generator with my longjmp() and setjmp():

{ In computer science, a generator is a routine that can be used to control
  the iteration behaviour of a loop. All generators are also iterators. A
  generator is very similar to a function that returns an array, in that a
  generator has parameters, can be called, and generates a sequence of
  values. However, instead of building an array containing all the values
  and returning them all at once, a generator yields the values one at a
  time, which requires less memory and allows the caller to start
  processing the first few values immediately. In short, a generator looks
  like a function but behaves like an iterator. }

program test_generator;

{$APPTYPE CONSOLE}

uses
  JmpLib64;

type
  PtrInt = ^Integer;

var
  childtask, maintask: jmp_buf;
  myarr1: array of integer;
  i, a: integer;
  Ptr1: PtrInt;

function generator(var myarr: array of integer): integer;
var
  i1: integer;
  val: integer;
  ptr: PtrInt;
begin
  i1 := 0;
  val := setjmp(childtask);
  if val = 0 then
  begin
    // First activation: yield the first element of the array.
    new(ptr);
    ptr^ := myarr1[i1];
    longjmp(maintask, uint64(ptr));
  end;
  if val = 10 then
  begin
    writeln('Exiting child..');
    exit;
  end;
  // val-1 is the index of the value that was consumed by the main program.
  i1 := val - 1;
  inc(i1);
  new(ptr);
  ptr^ := myarr1[i1];
  longjmp(maintask, uint64(ptr));
end;

begin
  setlength(myarr1, 10);
  for i := 0 to 9 do
    myarr1[i] := i;
  uint64(ptr1) := setjmp(maintask);
  if ptr1 = nil then
    generator(myarr1);
  a := ptr1^;
  dispose(ptr1);
  if (a <= length(myarr1)) then
  begin
    if a = length(myarr1) then
      longjmp(childtask, a + 1)
    else
    begin
      writeln('Value returned by generator is: ', a);
      longjmp(childtask, a + 1);
    end;
  end;
  setlength(myarr1, 0);
end.

====

Thank you,
Amine Moulay Ramdane. |
| Amine Moulay Ramdane <aminer68@gmail.com>: Nov 23 12:24PM -0800
Hello,

More of my philosophy about setjmp() and longjmp() and generators and coroutines..

I am a white Arab from Morocco, and I think I am smart since I have also invented many scalable algorithms and other algorithms..

I have just quickly implemented setjmp() and longjmp() in x64 assembler, and after that I have quickly implemented a good example of a generator with my setjmp() and longjmp(); look at it below. In computer science, a generator is a routine that can be used to control the iteration behaviour of a loop. All generators are also iterators. A generator is very similar to a function that returns an array, in that a generator has parameters, can be called, and generates a sequence of values. However, instead of building an array containing all the values and returning them all at once, a generator yields the values one at a time, which requires less memory and allows the caller to start processing the first few values immediately. In short, a generator looks like a function but behaves like an iterator.

So here are my implementations in FreePascal and Delphi, and they are working perfectly.

Here is my first unit, which implements longjmp() and setjmp(); notice how I am saving the non-volatile registers and how I am coding it in x64 assembler:

======

{ Volatile registers: The calling program assumes registers RAX, RCX, RDX,
  and R8 through R11 are volatile. The contents of registers RBX, RSI, RDI,
  RBP, RSP, and R12 through R15 are considered non-volatile. Functions
  return values in RAX. }

unit JmpLib64;

{$IFDEF FPC}
 {$ASMMODE intel}
{$ENDIF}

interface

type
  jmp_buf = record
    RBX, RSI, RDI, RSP, RBP, RIP, R12, R13, R14, R15: UInt64;
  end;

{ setjmp captures the complete task state which can later be used to perform
  a non-local goto using longjmp. setjmp returns 0 when it is initially
  called, and a non-zero value when it is returning from a call to longjmp.
  setjmp must be called before longjmp. }
function setjmp(out jmpb: jmp_buf): UInt64;

{ longjmp restores the task state captured by setjmp (and passed in jmpb).
  It then returns in such a way that setjmp appears to have returned with
  the value retval. setjmp must be called before longjmp. }
procedure longjmp(const jmpb: jmp_buf; retval: UInt64);

implementation

function setjmp(out jmpb: jmp_buf): UInt64; assembler; {$IFDEF FPC} nostackframe; {$ENDIF} register;
asm
  { -> RCX jmpb }
  { <- RAX Result }
  MOV RDX, [RSP]               // Fetch return address (RIP)
  // Save task state
  MOV [RCX+jmp_buf.&RBX], RBX
  MOV [RCX+jmp_buf.&RSI], RSI
  MOV [RCX+jmp_buf.&RDI], RDI
  MOV [RCX+jmp_buf.&RSP], RSP
  MOV [RCX+jmp_buf.&RBP], RBP
  MOV [RCX+jmp_buf.&RIP], RDX
  MOV [RCX+jmp_buf.&R12], R12
  MOV [RCX+jmp_buf.&R13], R13
  MOV [RCX+jmp_buf.&R14], R14
  MOV [RCX+jmp_buf.&R15], R15
  SUB RAX, RAX                 // setjmp returns 0 on the initial call
@@1:
end;

procedure longjmp(const jmpb: jmp_buf; retval: UInt64); assembler; {$IFDEF FPC} nostackframe; {$ENDIF} register;
asm
  { -> RCX jmpb }
  {    RDX retval }
  { <- RAX Result }
  XCHG RDX, RCX
  MOV RAX, RCX
  MOV RCX, [RDX+jmp_buf.&RIP]
  // Restore task state
  MOV RBX, [RDX+jmp_buf.&RBX]
  MOV RSI, [RDX+jmp_buf.&RSI]
  MOV RDI, [RDX+jmp_buf.&RDI]
  MOV RSP, [RDX+jmp_buf.&RSP]
  MOV RBP, [RDX+jmp_buf.&RBP]
  MOV R12, [RDX+jmp_buf.&R12]
  MOV R13, [RDX+jmp_buf.&R13]
  MOV R14, [RDX+jmp_buf.&R14]
  MOV R15, [RDX+jmp_buf.&R15]
  MOV [RSP], RCX               // Restore return address (RIP)
  TEST RAX, RAX                // Ensure retval is <> 0
  JNZ @@1
  MOV RAX, 1
@@1:
end;

end.

================

And here is my example of a generator with my longjmp() and setjmp():

{ In computer science, a generator is a routine that can be used to control
  the iteration behaviour of a loop. All generators are also iterators. A
  generator is very similar to a function that returns an array, in that a
  generator has parameters, can be called, and generates a sequence of
  values. However, instead of building an array containing all the values
  and returning them all at once, a generator yields the values one at a
  time, which requires less memory and allows the caller to start
  processing the first few values immediately. In short, a generator looks
  like a function but behaves like an iterator. }

program test_generator;

{$APPTYPE CONSOLE}

uses
  JmpLib64;

type
  PtrInt = ^Integer;

var
  childtask, maintask: jmp_buf;
  myarr1: array of integer;
  i, a: integer;
  Ptr1: PtrInt;

function generator(var myarr: array of integer): integer;
var
  i1: integer;
  val: integer;
  ptr: PtrInt;
begin
  i1 := 0;
  val := setjmp(childtask);
  if val = 0 then
  begin
    // First activation: yield the first element of the array.
    new(ptr);
    ptr^ := myarr1[i1];
    longjmp(maintask, uint64(ptr));
  end;
  if val = 10 then
  begin
    writeln('Exiting child..');
    exit;
  end;
  // val-1 is the index of the value that was consumed by the main program.
  i1 := val - 1;
  inc(i1);
  new(ptr);
  ptr^ := myarr1[i1];
  longjmp(maintask, uint64(ptr));
end;

begin
  setlength(myarr1, 10);
  for i := 0 to 9 do
    myarr1[i] := i;
  uint64(ptr1) := setjmp(maintask);
  if ptr1 = nil then
    generator(myarr1);
  a := ptr1^;
  dispose(ptr1);
  if (a <= length(myarr1)) then
  begin
    if a = length(myarr1) then
      longjmp(childtask, a + 1)
    else
    begin
      writeln('Value returned by generator is: ', a);
      longjmp(childtask, a + 1);
    end;
  end;
  setlength(myarr1, 0);
end.

====

Thank you,
Amine Moulay Ramdane. |
| Amine Moulay Ramdane <aminer68@gmail.com>: Nov 23 11:23AM -0800
Hello,

More of my philosophy of how the wise man can win the fight..

I am a white Arab from Morocco, and I think I am smart since I have also invented many scalable algorithms and other algorithms..

How can the smart and culturally superior, but wise, humans win the fight? I think that we have to "abstract" correctly and make the abstractions "easy" to understand, so that we quickly convince a great number of people and so that democracy works correctly. We then have to make people understand the "mechanisms" that bring more and more quality, so we have to know how to convince people to follow those mechanisms and how to enforce those mechanisms with rules or laws.

And read my philosophy in the following link:

https://groups.google.com/g/alt.culture.morocco/c/ag_ziCVV0VA

And read my following proverbs, which I think are flexible from the start and which I have just quickly invented; here they are, read them carefully:

https://groups.google.com/g/alt.culture.morocco/c/ZyUvFt_nix8

And read my following poems of love, which I think are flexible from the start; here they are, read them carefully:

https://groups.google.com/g/alt.culture.morocco/c/qte9bCZiOiw

Thank you,
Amine Moulay Ramdane. |
| Amine Moulay Ramdane <aminer68@gmail.com>: Nov 23 10:22AM -0800
Hello,

I am a white Arab from Morocco, and I think I am smart since I have also invented many scalable algorithms and other algorithms..

AMD unveils EPYC Milan-X processors with up to 768MB of L3 cache per socket

AMD announced the first official details of the Milan-X chips, a modification of the EPYC Milan server processors with 3D-stacked L3 cache (dubbed 3D V-Cache), which can be up to 768 MB in size per socket. That massive amount of cache can speed up workloads by 50% on average, and the memory latency of EPYC Milan-X chips has been lowered by 51% within a NUMA node and by 42% between nodes.

Read more here:

https://www.gsmarena.com/amd_unveils_epyc_milanx_processors_with_up_to_768mb_of_l3_cache_per_socket-news-51767.php

Thank you,
Amine Moulay Ramdane. |
| Amine Moulay Ramdane <aminer68@gmail.com>: Nov 23 08:58AM -0800
Hello,

I am a white Arab from Morocco, and I think I am smart since I have also invented many scalable algorithms and other algorithms..

What impact is a developer's dissatisfaction likely to have on their productivity?

Reality, supported by scientific studies, shows that a worker is more productive when he is happy in the workplace.

Read more here:

https://emploi-developpez-com.translate.goog/actu/328717/Quel-impact-le-mecontentement-d-un-developpeur-est-il-susceptible-d-avoir-sur-sa-productivite-Perte-de-motivation-Manque-de-concentration-Non-respect-des-delais-Partagez-votre-experience/?_x_tr_sl=auto&_x_tr_tl=en&_x_tr_hl=en&_x_tr_pto=nui

Are the Nordic countries really less innovative than the US?

Though we tend to view American businesses as uniquely innovative, the Nordic countries in Europe may be just as innovative, albeit with smaller populations.

Read more here:

https://voxeu.org/article/nordic-innovation-cuddly-capitalism-really-less-innovative

Thank you,
Amine Moulay Ramdane. |
| Amine Moulay Ramdane <aminer68@gmail.com>: Nov 23 08:15AM -0800
Hello,

I am a white Arab from Morocco, and I think I am smart since I have also invented many scalable algorithms and other algorithms..

Are the Nordic countries really less innovative than the US?

Though we tend to view American businesses as uniquely innovative, the Nordic countries in Europe may be just as innovative, albeit with smaller populations. This could be explained in part by the fact that the Nordic countries have advantageous tax regimes for businesses.

Read more here:

https://voxeu.org/article/nordic-innovation-cuddly-capitalism-really-less-innovative

Thank you,
Amine Moulay Ramdane. |
| You received this digest because you're subscribed to updates for this group. You can change your settings on the group membership page. To unsubscribe from this group and stop receiving emails from it send an email to comp.programming.threads+unsubscribe@googlegroups.com. |