Comparison of programming paradigms
This article attempts to set out the similarities and differences between the main programming paradigms, as a summary in both graphical and tabular format, with links to the separate discussions of these similarities and differences in existing Wikipedia articles.
Contents
1 Main paradigm approaches
2 Differences in terminology
3 Language support
4 Performance comparison
  4.1 Managed code
  4.2 Pseudocode examples comparing various paradigms
    4.2.1 Subroutine, method call overhead
    4.2.2 Allocation of dynamic memory for message and object storage
    4.2.3 Dynamically dispatched message calls v. direct procedure call overheads
  4.3 Serialization of objects
  4.4 Parallel computing
5 See also
6 References
7 Further reading
8 External links
Main paradigm approaches

Imperative
    Description: Computation as statements that directly change a program state (datafields)

Structured
    Description: A style of imperative programming with more logical program structure

Procedural
    Description: Derived from structured programming, based on the concept of modular programming or the procedure call

Functional
    Description: Treats computation as the evaluation of mathematical functions, avoiding state and mutable data
    Main characteristics: Lambda calculus, compositionality, referential transparency, no side effects

Event-driven
    Description: Program flow is determined mainly by events, such as mouse clicks or interrupts (including timers)
    Main characteristics: Main loop, event handlers, asynchronous processes
    Related paradigm(s): Procedural, dataflow

Object-oriented
    Description: Treats datafields as objects manipulated through pre-defined methods only
    Main characteristics: Objects, methods, message passing, information hiding, data abstraction, encapsulation, polymorphism, inheritance, serialization-marshalling
    Critics: Yes[1][2][3]

Declarative
    Description: Defines computation logic without defining its detailed control flow
    Main characteristics: 4GLs, spreadsheets, report program generators

Automata-based
    Description: Treats programs as a model of a finite state machine or any other formal automaton
    Main characteristics: State enumeration, control variable, state changes, isomorphism, state transition table
    Related paradigm(s): Imperative, event-driven
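As a rough illustration of how the descriptions above differ in practice, the following C++ sketch (an editorial example, not taken from any cited source; the names sum_of_squares_proc and Squares are invented) computes the same result, the sum of squares of a small array, in an imperative, a procedural, a functional, and an object-oriented style.

    #include <iostream>
    #include <numeric>
    #include <vector>

    // Procedural style: the computation is factored into a named procedure.
    int sum_of_squares_proc(const std::vector<int>& xs) {
        int total = 0;
        for (int x : xs) total += x * x;   // imperative core: statements mutate 'total'
        return total;
    }

    // Object-oriented style: the data is encapsulated and reached only via a method.
    class Squares {
    public:
        explicit Squares(std::vector<int> xs) : xs_(std::move(xs)) {}
        int sum() const {
            int total = 0;
            for (int x : xs_) total += x * x;
            return total;
        }
    private:
        std::vector<int> xs_;
    };

    int main() {
        std::vector<int> xs{1, 2, 3, 4};

        // Imperative style: state (total) is changed directly by statements.
        int total = 0;
        for (int x : xs) total += x * x;

        // Functional style: an expression built from existing functions, no visible mutation.
        int functional = std::accumulate(xs.begin(), xs.end(), 0,
                                         [](int acc, int x) { return acc + x * x; });

        std::cout << total << ' '
                  << sum_of_squares_proc(xs) << ' '
                  << functional << ' '
                  << Squares(xs).sum() << '\n';   // all four print 30
        return 0;
    }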
Differences in terminology
Despite multiple (types of) programming paradigms existing in parallel (with sometimes apparently conflicting definitions), many of the underlying fundamental components remain more or less the same (constants, variables, datafields, subroutines, calls, etc.) and must therefore be incorporated into each paradigm with similarly equivalent attributes or functions. The table above is not intended as a guide to precise similarities, but more as an index of where to look for more information, based on the different naming of these entities within each paradigm. Non-standardized implementations of each paradigm in numerous programming languages further complicate the overall picture, especially those languages that support multiple paradigms, each with its own jargon.
"You can know the name of a bird in all the languages of the world, but when you're finished, you'll know absolutely nothing whatever about the bird... So let's look at the bird and see what it's doing-- that's what counts. I learned very early the difference between knowing the name of something and knowing something. Richard Feynman
Language support
Main article: Syntactic sugar

Syntactic sugar is a term used to describe the sweetening of program functionality by introducing language features that facilitate particular usage, even if the end result could be achieved without them.

One example of syntactic sugar may arguably be classes in C++ (and in Java, C#, etc.). The C language can support object-oriented programming via its facilities of function pointers, type casting, and structures. However, languages such as C++ aim to make object-oriented programming more convenient by introducing syntax specific to this coding style. Moreover, the specialized syntax works to emphasize the object-oriented approach.

Similarly, functions and looping syntax in C (and other procedural and structured programming languages) could be considered syntactic sugar. Assembly language can support procedural or structured programming via its facilities for modifying register values and branching execution depending on program state. However, languages such as C introduced syntax specific to these coding styles to make procedural and structured programming more convenient.

Features of the C# (C Sharp) programming language, such as properties and interfaces, similarly do not enable new functionality, but are designed to make good programming practices more prominent and natural. Some programmers feel that these features are unimportant or even frivolous. For example, Alan Perlis once quipped, in a reference to bracket-delimited languages, that "syntactic sugar causes cancer of the semicolon" (see Epigrams on Programming).
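As an illustrative sketch of the point about classes being expressible without class syntax (an editorial example; the identifiers CShape, Circle and c_circle_area are invented), the C++ code below implements a shape with an area operation twice: once in the C style described above, using a structure and an explicit function pointer, and once using the class syntax that sugars the same idea.

    #include <cstdio>

    // "C style": a structure carrying its data and an explicit function pointer,
    // dispatched by hand -- roughly what class syntax automates.
    struct CShape {
        double radius;
        double (*area)(const CShape*);   // manually managed "method" pointer
    };

    static double c_circle_area(const CShape* s) {
        return 3.142 * s->radius * s->radius;
    }

    // C++ class syntax: the same idea, with the dispatch handled by the language.
    class Circle {
    public:
        explicit Circle(double r) : radius_(r) {}
        double area() const { return 3.142 * radius_ * radius_; }
    private:
        double radius_;
    };

    int main() {
        CShape c_style{2.0, &c_circle_area};
        Circle cpp_style(2.0);

        std::printf("%f %f\n", c_style.area(&c_style), cpp_style.area());
        return 0;
    }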
2 of 6
03/20/2012 04:54 PM
http://en.wikipedia.org/wiki/Comparison_of_prog...
An extension of this is the term "syntactic saccharin", meaning gratuitous syntax that does not make programming easier.[4]
Performance comparison
Purely in terms of total instruction path length, a program coded in an imperative style, without using any subroutines at all, would have the lowest count. However, the binary size of such a program might be larger than the same program coded using subroutines (as in functional and procedural programming), and it would reference more "non-local" physical instructions that may increase cache misses and instruction-fetch overhead in modern processors.

The paradigms that use subroutines extensively (including functional, procedural and object-oriented) and that do not also use significant inlining (via compiler optimizations) will consequently use a greater percentage of total resources on the subroutine linkages themselves. Object-oriented programs that do not deliberately alter program state directly, instead using mutator methods (or "setters") to encapsulate these state changes, will as a direct consequence have greater overhead. This is because message passing is essentially a subroutine call, but with three additional overheads: dynamic memory allocation, parameter copying, and dynamic dispatch. Obtaining memory from the heap and copying parameters for message passing may involve significant resources that far exceed those required for the state change itself. Accessors (or "getters") that merely return the values of private member variables also depend on similar message-passing subroutines, instead of using a more direct assignment (or comparison), adding to total path length.
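The following minimal C++ sketch (editorial, with invented names PlainCounter and EncapsulatedCounter) places the accessor and mutator calls discussed above next to a direct field update; in the absence of inlining, each setter or getter adds a call linkage to the instruction path, which is why optimizing compilers usually inline such trivial methods.

    // Direct state change: a plain structure whose field is assigned in place.
    struct PlainCounter {
        int value = 0;
    };

    // Encapsulated state change: the same field reached only through methods.
    class EncapsulatedCounter {
    public:
        void set(int v) { value_ = v; }      // mutator ("setter"): one call per state change
        int  get() const { return value_; }  // accessor ("getter"): one call per read
    private:
        int value_ = 0;
    };

    int main() {
        PlainCounter p;
        p.value = 42;                 // conceptually a single store instruction

        EncapsulatedCounter e;
        e.set(42);                    // call + store + return, unless inlined
        return e.get() - p.value;     // returns 0; both paths reach the same state
    }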
Managed code
For programs executing in a managed code environment, such as the .NET Framework, many issues affecting performance are significantly influenced by the programming language paradigm and the various language features used.[5]
Pseudocode examples comparing various paradigms

The pseudocode below compares the instruction path of the same small computation (squaring a value and multiplying it by the constant 3.142) coded in a procedural and in an object-oriented style; the number at the right of each line is a cumulative instruction count.

Procedural

    main proc:
      load r;                                        1
      call area(r,result);
        +load p = address of parameter list;         2
        +load v = address of subroutine 'area';      3
        +goto v with return;                         4

    area proc(r2,res):
      push stack                                     5
      load r2;                                       6
      r3 = r2 * r2;                                  7
      res = r3 * "3.142";                            8
      pop stack                                      9
      return;                                        10

    storage:
      result variable
      constant "3.142"
      parameter list variable
      function pointer (==>area)
      stack storage

Object-oriented

    main proc:
      load r;                                        1
      result = circle.area(r);
        +allocate heap storage;                      2 [See 1]
        +copy r to message;                          3
        +load p = address of message;                4
        +load v = addr. of method 'circle.area';     5
        +goto v with return;                         6

    circle.area method(r2):
      push stack                                     7
      load r2;                                       8
      r3 = r2 * r2;                                  9
      res = r3 * "3.142";                            10
      pop stack                                      11
      return(res);                                   12,13

    storage:
      result variable (assumed pre-allocated)
      immutable variable "3.142" (final)
      (heap) message variable for circle method call
      vtable (==>area)
      stack storage
1. ^ See section: Allocation of dynamic memory for message and object storage
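As a very rough mapping of the pseudocode into compilable code (an editorial illustration; the names area and Circle are invented, and the virtual method stands in for the vtable lookup shown above), the C++ sketch below shows the procedural call of the left-hand column and the object-oriented call of the right-hand column, including the heap allocation that note 1 refers to.

    #include <memory>

    // Procedural version of the left-hand column: a free function, parameters
    // passed directly, no heap allocation involved in the call itself.
    double area(double r2) {
        double r3 = r2 * r2;
        return r3 * 3.142;
    }

    // Object-oriented version of the right-hand column: the radius is wrapped in
    // an object, and the caller goes through a (potentially virtual) method.
    class Circle {
    public:
        explicit Circle(double r) : r_(r) {}
        virtual ~Circle() = default;
        virtual double area() const { return r_ * r_ * 3.142; }
    private:
        double r_;
    };

    int main() {
        double r = 2.0;

        double procedural_result = area(r);          // direct call

        // The heap allocation corresponds to "+allocate heap storage" in the
        // pseudocode; the virtual call corresponds to the vtable lookup
        // ("load v = addr. of method").
        std::unique_ptr<Circle> circle = std::make_unique<Circle>(r);
        double oo_result = circle->area();

        return procedural_result == oo_result ? 0 : 1;   // both compute the same value
    }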
The advantages of procedural abstraction and object-oriented-style polymorphism are not well illustrated by a small example like the one above. This example is designed principally to illustrate some intrinsic performance differences, not abstraction or code re-use.

Subroutine, method call overhead

The presence of a (called) subroutine in a program contributes nothing extra to the functionality of the program regardless of paradigm, but it may contribute greatly to the structuring and generality of the program, making it much easier to write, modify, and extend.[10] The extent to which different paradigms utilize subroutines (and their consequent memory requirements) influences the overall performance of the complete algorithm. As Guy Steele pointed out in a 1977 paper, a well-designed programming language implementation can have very low overheads for procedural abstraction, although he laments that most implementations seldom achieve this in practice, being "rather thoughtless or careless in this regard". In the same paper, Steele also makes a considered case for automata-based programming (utilizing procedure calls with tail recursion) and concludes that "we should have a healthy respect for procedure calls" (because they are powerful), but suggests to "use them sparingly".[10]

In terms of the frequency of subroutine calls:

- For procedural programming, the granularity of the code is largely determined by the number of discrete procedures or modules.[citation needed]
- For functional programming, frequent calls to library subroutines are commonplace (but may be frequently inlined by the optimizing compiler).
- For object-oriented programming, the number of method calls invoked is also partly determined by the granularity of the data structures and may therefore include many read-only accesses to low-level objects that are encapsulated (and therefore accessible in no other, more direct, way). Since increased granularity is a prerequisite for greater code reuse, the tendency is towards fine-grained data structures and a corresponding increase in the number of discrete objects (and their methods) and, consequently, subroutine calls. The creation of god objects is actively discouraged. Constructors also add to the count, as they are also subroutine calls (unless they are inlined). Performance problems caused by excessive granularity may not become apparent until scalability becomes an issue.
- For other paradigms, where a mixture of the above paradigms may be employed, subroutine usage is less predictable.

Allocation of dynamic memory for message and object storage

Uniquely, the object-oriented paradigm involves dynamic allocation of memory from heap storage for both object creation and message passing. A 1994 benchmark, "Memory Allocation Costs in Large C and C++ Programs", conducted by Digital Equipment Corporation on a variety of software using an instruction-level profiling tool, measured how many instructions were required per dynamic storage allocation. The results showed that the lowest absolute number of instructions executed averaged around 50, but others reached as high as 611.[11] See also "Heap: Pleasures and pains" by Murali R. Krishnan,[12] which states "Heap implementations tend to stay general for all platforms, and hence have heavy overhead". The above pseudocode example does not include a realistic estimate of this memory allocation path length or the memory prefix overheads involved, nor the subsequent associated garbage collection overheads.
Suggesting that heap allocation is a non-trivial task, one open-source microallocator, by game developer John W. Ratcliff, consists of nearly 1,000 lines of code.[13]

Dynamically dispatched message calls v. direct procedure call overheads

In the abstract of "Optimization of Object-Oriented Programs Using Static Class Hierarchy Analysis", Jeffrey Dean, David Grove, and Craig Chambers of the Department of Computer Science and Engineering at the University of Washington claim that "Heavy use of inheritance and dynamically-bound messages is likely to make code more extensible and reusable, but it also imposes a significant performance overhead, compared to an equivalent but non-extensible program written in a non-object-oriented manner. In some domains, such as structured graphics packages, the performance cost of the extra flexibility provided by using a heavily object-oriented style is acceptable. However, in other domains, such as basic data structure libraries, numerical computing packages, rendering libraries, and trace-driven simulation frameworks, the cost of message passing can be too great, forcing the programmer to avoid object-oriented programming in the hot spots of their application."[14]
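To make the dispatch cost being discussed concrete, the following C++ sketch (editorial, with invented names Shape, Square and square_area) contrasts a dynamically-bound call through a base-class reference with a direct, statically-bound call; the former goes through a vtable lookup and normally cannot be inlined without whole-program or class-hierarchy analysis of the kind the paper describes.

    #include <cstdio>

    // Dynamically dispatched: the call site cannot know the target until run time.
    class Shape {
    public:
        virtual ~Shape() = default;
        virtual double area() const = 0;
    };

    class Square : public Shape {
    public:
        explicit Square(double side) : side_(side) {}
        double area() const override { return side_ * side_; }
    private:
        double side_;
    };

    // Statically bound: an ordinary function the compiler can call (or inline) directly.
    double square_area(double side) { return side * side; }

    int main() {
        Square sq(3.0);
        const Shape& shape = sq;            // extensible, but every call is indirect

        std::printf("%f %f\n",
                    shape.area(),           // vtable lookup + indirect call
                    square_area(3.0));      // direct call, trivially inlinable
        return 0;
    }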
Serialization of objects
Main article: Serialization

Serialization imposes quite considerable overheads when passing objects from one system to another, especially when the transfer is in human-readable formats such as XML and JSON. This contrasts with compact binary formats for non-object-oriented data. Both encoding and decoding of the object's data values and its attributes are involved in the serialization process, which also includes awareness of complex issues such as inheritance, encapsulation, and data hiding.
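As a hedged sketch of the size and encoding overhead described above (an editorial example; the hand-written JSON text is for illustration only and is not a full serializer), the following C++ program writes the same three fields of an object once as human-readable JSON text and once as raw binary, and prints both byte counts.

    #include <cstdint>
    #include <cstring>
    #include <iostream>
    #include <sstream>
    #include <string>

    struct Point {
        std::int32_t x;
        std::int32_t y;
        std::int32_t z;
    };

    // Human-readable serialization: field names, punctuation and decimal digits
    // all have to be encoded, transferred and later parsed back.
    std::string to_json(const Point& p) {
        std::ostringstream out;
        out << "{\"x\": " << p.x << ", \"y\": " << p.y << ", \"z\": " << p.z << "}";
        return out.str();
    }

    // Compact binary serialization: the raw field bytes only (12 bytes here),
    // at the cost of being neither readable nor self-describing.
    std::string to_binary(const Point& p) {
        std::string out(sizeof(Point), '\0');
        std::memcpy(&out[0], &p, sizeof(Point));
        return out;
    }

    int main() {
        Point p{1234, -56, 789};
        std::cout << "JSON bytes: "   << to_json(p).size()   << '\n'   // a few dozen bytes
                  << "binary bytes: " << to_binary(p).size() << '\n';  // 12 bytes
        return 0;
    }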
Parallel computing
Main article: Parallel computing

Carnegie Mellon University professor Robert Harper wrote in March 2011: "This semester Dan Licata and I are co-teaching a new course on functional programming for first-year prospective CS majors... Object-oriented programming is eliminated entirely from the introductory curriculum, because it is both anti-modular and anti-parallel by its very nature, and hence unsuitable for a modern CS curriculum. A proposed new course on object-oriented design methodology will be offered at the sophomore level for those students who wish to study this topic."[15]
See also
Comparison of programming languages
Comparison of programming languages (basic instructions)
Granularity
Message passing
Subroutine
References
1. ^ Jacobs, B. (2006-08-27). "Object-oriented Programming Oversold". Archived from the original (http://www.geocities.com/tablizer/oopbad.htm) on 2006-10-15. http://web.archive.org/web/20061015181417/http://www.geocities.com/tablizer/oopbad.htm.
2. ^ Shelly, Asaf (2008-08-22). "Flaws of Object-oriented Modeling". Intel Software Network. http://software.intel.com/en-us/blogs/2008/08/22/flaws-of-object-oriented-modeling/. Retrieved 2010-07-04.
3. ^ Yegge, Steve (2006-03-30). "Execution in the Kingdom of Nouns". steve-yegge.blogspot.com. http://steve-yegge.blogspot.com/2006/03/execution-in-kingdom-of-nouns.html. Retrieved 2010-07-03.
4. ^ "The Jargon File v4.4.7: 'syntactic sugar'". http://www.retrologic.com/jargon/S/syntactic-sugar.html.
5. ^ Gray, Jan (June 2003). "Writing Faster Managed Code: Know What Things Cost". MSDN. Microsoft. http://msdn.microsoft.com/en-us/library/ms973852.
6. ^ "The True Cost of Calls". wordpress.com. 2008-12-30. http://hbfs.wordpress.com/2008/12/30/the-true-cost-of-calls/.
7. ^ http://en.wikibooks.org/wiki/X86_Disassembly/Functions_and_Stack_Frames
8. ^ Roberts, Eric S. (2008). "Art and Science of Java; Chapter 7: Objects and Memory". Stanford University. http://www-cs-faculty.stanford.edu/~eroberts/books/ArtAndScienceOfJava/slides/07-ObjectsAndMemory.ppt.
9. ^ Roberts, Eric S. (2008). Art and Science of Java. Addison-Wesley. ISBN 978-0321486127.
10. ^ a b Guy Lewis Steele, Jr. "Debunking the 'Expensive Procedure Call' Myth, or, Procedure Call Implementations Considered Harmful, or, Lambda: The Ultimate GOTO". MIT AI Lab. AI Lab Memo AIM-443. October 1977. http://repository.readscheme.org/ftp/papers/ai-labpubs/AIM-443.pdf; http://dspace.mit.edu/handle/1721.1/5753.
11. ^ Detlefs, David; Dosser, Al; Zorn, Benjamin (June 1994). "Memory Allocation Costs in Large C and C++ Programs". Software: Practice and Experience 24 (6): 527-542.
12. ^ Krishnan, Murali R. (February 1999). "Heap: Pleasures and pains". microsoft.com. http://msdn.microsoft.com/en-us/library/ms810466%28v=MSDN.10%29.aspx.
13. ^ "MicroAllocator.h". Google Code. http://code.google.com/p/microallocator/. Retrieved 2012-01-29.
14. ^ Dean, Jeffrey; Grove, David; Chambers, Craig. "Optimization of Object-Oriented Programs Using Static Class Hierarchy Analysis". University of Washington. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.117.2420&rep=rep1&type=pdf.
15. ^ "Teaching FP to Freshmen" (http://existentialtype.wordpress.com/2011/03/15/teaching-fp-to-freshmen/), from Robert Harper's blog about teaching introductory computer science. See also "Getting Started" (http://existentialtype.wordpress.com/2011/03/15/getting-started/).
Further reading
"A Memory Allocator" (http://g.oswego.edu/dl/html/malloc.html) by Doug Lea "Dynamic Memory Allocation and Linked Data Structures" (http://www.sqa.org.uk/e-learning/LinkedDS01CD /page_01.htm) by (Scottish Qualifications Authority) "Inside A Storage Allocator" (http://www.flounder.com/inside_storage_allocation.htm) by Dr. Newcomer Ph.D
External links
Comparing Programming Paradigms (http://users.ecs.soton.ac.uk/mrd/research/prog.html) by Dr Rachel Harrison and Mr Lins Samaraweera
Comparing Programming Paradigms: an Evaluation of Functional and Object-Oriented Programs (http://eprints.ecs.soton.ac.uk/597/) by Harrison, R., Samaraweera, L. G., Dobie, M. R. and Lewis, P. H. (1996), pp. 247-254. ISSN 0268-6961
"The principal programming paradigms" (http://www.info.ucl.ac.be/~pvr/paradigmsDIAGRAMeng101.pdf) by Peter Van Roy
"Concepts, Techniques, and Models of Computer Programming" (http://www.info.ucl.ac.be/~pvr/book.html) (2004) by Peter Van Roy and Seif Haridi, ISBN 0-262-22069-5
The True Cost of Calls (http://hbfs.wordpress.com/2008/12/30/the-true-cost-of-calls/), from the "Harder, Better, Faster, Stronger" blog by computer scientist Steven Pigeon (http://www.stevenpigeon.org/Publications/)