Six-State Continuous Processing Model for a New Theory of Computing

In contrast to computation in the von Neumann architecture, the human mind improves the speed and accuracy of its processing over repeated execution cycles. Further, it has been postulated that in the human mind memory is a result of continuous processing and is not separated from processing. Inspired by this mind model, a six-state continuous processing model, with the states New, Ready, Running, Blocked, Sleep, and Terminate, has been proposed to implement a special compiler with a conditionally evolving memory that improves the performance of conventional computation by exploiting the 24 Causal Relations described in the Buddhist Theory of Mind. Experiments were conducted to demonstrate how the proposed model speeds up the execution of source code and compilers. The results show a clear gain in computational performance, achieved by avoiding memory overload and by ensuring that high-quality code segments are executed at the right time.
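The six states named above can be pictured as a process state machine. The abstract does not enumerate the permitted transitions, so the following minimal sketch assumes a transition graph modeled on the classic five-state process model with an added Sleep state; the state names come from the abstract, everything else is an illustrative assumption.

```python
from enum import Enum, auto

class State(Enum):
    """The six states named in the proposed continuous processing model."""
    NEW = auto()
    READY = auto()
    RUNNING = auto()
    BLOCKED = auto()
    SLEEP = auto()
    TERMINATE = auto()

# Assumed transition graph: the classic five-state process model
# (New -> Ready -> Running -> {Ready, Blocked, Terminate}) extended with
# a Sleep state reachable from Running. The paper itself does not list
# the transitions, so this graph is a guess for illustration only.
TRANSITIONS = {
    State.NEW: {State.READY},
    State.READY: {State.RUNNING},
    State.RUNNING: {State.READY, State.BLOCKED, State.SLEEP, State.TERMINATE},
    State.BLOCKED: {State.READY},
    State.SLEEP: {State.READY},
    State.TERMINATE: set(),
}

def can_transition(src: State, dst: State) -> bool:
    """Return True if the assumed state machine permits src -> dst."""
    return dst in TRANSITIONS[src]
```

Under this assumed graph, a running process may move to Sleep and later return to Ready, while Terminate is absorbing.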
