Programming Paradigms - Cse.iitd.ac.in


Programming Paradigms

- Imperative Programming – Fortran, C, Pascal
- Functional Programming – Lisp
- Object Oriented Programming – Simula, C++, Smalltalk
- Logic Programming – Prolog

Parallel Programming

A common misconception is that parallel programs are more difficult to write than sequential programs. Consider these situations:

- Write a sequential program and run multiple copies of it on a parallel computer. The parallelism remains transparent to the user.
- Write an Oracle application. Oracle is implicitly parallel; run it on a parallel machine. The same holds for parallel C and Fortran 90.

Parallel Programming

A parallel computer should be flexible and easy to use. This depends on its architecture and on the way we write parallel programs for it. Let us consider various parallel programming paradigms:

Parallel Programming Paradigms

- Phase parallel
- Divide and conquer
- Pipeline
- Process farm
- Work pool

Remark: the parallel program consists of a number of supersteps, and each superstep has two phases: a computation phase and an interaction phase.

Phase Parallel Model

(Diagram: two rows of independent computations C, each row followed by a synchronous interaction.)

- The phase-parallel model offers a paradigm that is widely used in parallel programming.
- The parallel program consists of a number of supersteps, and each superstep has two phases.
- In a computation phase, multiple processes each perform an independent computation C.
- In the subsequent interaction phase, the processes perform one or more synchronous interaction operations, such as a barrier or a blocking communication.
- Then the next superstep is executed.

This paradigm is also known as the Loosely Synchronous Paradigm or the Agenda Paradigm.

Phase Parallel Model: Shortcomings

- Interaction is not overlapped with computation.
- It is difficult to maintain a balanced workload among processors.

A special case of the phase-parallel paradigm is the Synchronous Iteration Paradigm, where the supersteps are a sequence of iterations in a loop. Consider the example of computing x = f(x), where x is an n-dimensional vector.

Synchronous Iteration Paradigm

    parfor (i = 0; i < n; i++)   // create n processes, each executing a for loop
    {
        for (j = 0; j < N; j++)
        {
            x[i] = fi(x);
            barrier;
        }
    }

For n = 9 we have:

    x0 = f0(x)   x1 = f1(x)   ...   x9 = f9(x)     (iterate N times)
                 Barrier Synchronisation
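The parfor loop above can be sketched in plain Python. This is an illustrative stand-in, not the slides' own code: `threading.Thread` plays the role of a process, `threading.Barrier` the role of the `barrier` statement, and the component functions fi are replaced by a simple neighbour average. A second barrier call separates the reads from the writes, a detail the one-line `barrier` glosses over.

```python
import threading

n, N = 4, 20                      # n processes, N iterations (illustrative sizes)
x = [1.0, 2.0, 3.0, 4.0]          # shared n-dimensional vector
barrier = threading.Barrier(n)    # all n processes meet here in each superstep

def f(i, vec):
    # Example component function: average x[i] with its cyclic neighbour.
    return 0.5 * (vec[i] + vec[(i + 1) % n])

def worker(i):
    for j in range(N):
        new_xi = f(i, x)          # computation phase: read the shared vector
        barrier.wait()            # wait until every process has read
        x[i] = new_xi             # write the new component
        barrier.wait()            # interaction phase: wait until every process has written

threads = [threading.Thread(target=worker, args=(i,)) for i in range(n)]
for t in threads: t.start()
for t in threads: t.join()
print(x)
```

Because every read in iteration j+1 happens after every write of iteration j, the result is deterministic; with this averaging fi the components converge toward the common mean 2.5.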

Asynchronous Iteration Paradigm

    parfor (i = 0; i < n; i++)
    {
        for (j = 0; j < N; j++)
            x[i] = fi(x);
    }

This allows a process to proceed to the next iteration without waiting for the remaining processes to catch up.

The above code could be indeterminate: while one process is computing x[i] in the jth iteration, the x[i-1] value it uses may have been computed by another process in iteration j-1. However, under certain conditions an asynchronous iteration algorithm will converge to the correct result, and it is faster than the synchronous iteration algorithm.
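The asynchronous variant can be sketched by simply dropping the barrier from the synchronous sketch (again an illustrative stand-in with threads and a hypothetical averaging fi, not the slides' code). The final values now depend on thread scheduling, which is exactly the indeterminacy described above; with an averaging fi they at least stay within the range of the initial data.

```python
import threading

n, N = 4, 20
x = [1.0, 2.0, 3.0, 4.0]

def f(i, vec):
    return 0.5 * (vec[i] + vec[(i + 1) % n])

def worker(i):
    for j in range(N):
        # No barrier: the neighbour value read here may come from the
        # neighbour's j-th iteration or from an older one.
        x[i] = f(i, x)

threads = [threading.Thread(target=worker, args=(i,)) for i in range(n)]
for t in threads: t.start()
for t in threads: t.join()
print(x)
```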

Divide and Conquer

- A parent process divides its workload into several smaller pieces and assigns them to a number of child processes.
- The child processes then compute their workloads in parallel, and the results are merged by the parent.
- The dividing and merging procedures are done recursively.
- This paradigm is very natural for computations such as quicksort. Its disadvantage is the difficulty of achieving a good load balance.

Pipeline

- In the pipeline paradigm, a number of processes (e.g. P, Q, R) form a virtual pipeline.
- A continuous data stream is fed into the pipeline, and the processes execute at different pipeline stages simultaneously, in an overlapped fashion.
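The divide-and-conquer structure can be sketched as a parallel merge sort (a minimal illustration, not the slides' code: threads stand in for child processes, and a `depth` cut-off keeps the recursion from spawning unboundedly many threads):

```python
import threading

def merge(a, b):
    # Parent's merge step: combine two sorted child results.
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    return out + a[i:] + b[j:]

def parallel_sort(data, depth=2):
    if len(data) <= 1 or depth == 0:
        return sorted(data)              # small piece: solve sequentially
    mid = len(data) // 2
    results = [None, None]
    def child(k, piece):
        results[k] = parallel_sort(piece, depth - 1)
    left = threading.Thread(target=child, args=(0, data[:mid]))
    right = threading.Thread(target=child, args=(1, data[mid:]))
    left.start(); right.start()          # children work in parallel
    left.join(); right.join()            # parent waits for both children
    return merge(results[0], results[1])

print(parallel_sort([5, 3, 8, 1, 9, 2, 7]))   # [1, 2, 3, 5, 7, 8, 9]
```

The load-balance problem the slide mentions is visible here: if the split is uneven (as in quicksort with a bad pivot), one child finishes long before the other.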

Process Farm

- This paradigm is also known as the master-slave paradigm.
- A master process executes the essentially sequential part of the parallel program and spawns a number of slave processes to execute the parallel workload.
- When a slave finishes its workload, it informs the master, which assigns a new workload to the slave.
- This is a very simple paradigm, in which the coordination is done by the master.

Disadvantage: the master can become a bottleneck.
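A minimal master-slave sketch, assuming threads as stand-ins for slave processes and a `queue.Queue` for handing out work (the workload here, squaring numbers, is purely illustrative). A slave that finishes an item simply comes back to the queue for the next one, which is the "inform the master, get new work" step:

```python
import queue
import threading

tasks = queue.Queue()
results = queue.Queue()
STOP = object()                      # marker telling a slave to shut down

def slave():
    while True:
        item = tasks.get()
        if item is STOP:
            break
        results.put(item * item)     # the parallel workload: square a number

slaves = [threading.Thread(target=slave) for _ in range(3)]
for s in slaves:
    s.start()

for item in range(10):               # master hands out the work
    tasks.put(item)
for _ in slaves:                     # one stop marker per slave
    tasks.put(STOP)
for s in slaves:
    s.join()

total = sum(results.get() for _ in range(10))
print(total)                         # sum of squares 0..9 -> 285
```

The bottleneck noted above shows up when items are tiny: every slave then spends its time contending for the master's single queue rather than computing.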

Work Pool

- This paradigm is often used in a shared-variable model.
- A pool of work pieces is realized in a global data structure.
- A number of processes are created. Initially, there may be just one piece of work in the pool.
- Any free process fetches a piece of work from the pool and executes it, producing zero, one, or more new work pieces that are put back into the pool.
- The parallel program ends when the work pool becomes empty.
- This paradigm facilitates load balancing, as the workload is dynamically allocated to free processes.

Implementing the work pool to allow efficient access by multiple processes is not easy, especially in a message-passing model. The work pool may be an unordered set, a queue, or a priority queue.
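The distinctive work-pool features, one initial piece, workers that generate new pieces, and termination when the pool drains, can be sketched with `queue.Queue` (an illustrative choice: its `task_done`/`join` pair detects the empty-pool condition). The work items here are hypothetical: node labels of an implicit binary tree, where each node below a cut-off creates its two children as new work:

```python
import queue
import threading

pool = queue.Queue()                 # the global work pool (here: a FIFO queue)
found = []
lock = threading.Lock()

def worker():
    while True:
        node = pool.get()
        if node is None:             # shutdown marker, sent after the pool drains
            pool.task_done()
            break
        with lock:
            found.append(node)       # "execute" the piece of work
        if node < 8:                 # a piece may create new pieces:
            pool.put(2 * node)       # children in an implicit binary tree
            pool.put(2 * node + 1)
        pool.task_done()             # done only after any new pieces are in

workers = [threading.Thread(target=worker) for _ in range(4)]
for w in workers:
    w.start()

pool.put(1)                          # initially just one piece of work
pool.join()                          # blocks until every piece is processed
for _ in workers:                    # pool is empty: shut the workers down
    pool.put(None)
for w in workers:
    w.join()

print(sorted(found))                 # nodes 1..15 of the implicit tree
```

Note the ordering inside the worker: new pieces are put into the pool before `task_done`, so `pool.join()` cannot return while descendants are still pending.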

Programmability Issues

We define programmability as a combination of …

Parallel Programming Models

Implicit parallelism: the programmer does not explicitly specify parallelism, but lets the compiler and the run-time support system automatically exploit it.

Explicit parallelism: parallelism is explicitly specified in the source code by the programmer, using special language constructs, compiler directives, or library calls.

Implicit Parallel Programming Models

Implicit Parallelism: Parallelizing Compilers

- Automatic parallelization of sequential programs
- Dependency analysis: data dependency, control dependency

Remark: users' belief is influenced partly by the currently disappointing performance of automatic tools (implicit parallelism) and partly by theoretical results.

Effectiveness of Parallelizing Compilers

- Question: are parallelizing compilers effective in generating efficient code from sequential programs?
- Some performance studies indicate that they may not be effective.
- User direction and run-time parallelization techniques are needed.

Implicit Parallel Programming Models

Implicit Parallelism: Bernstein's Theorem

- It is difficult to decide whether two operations in an imperative sequential program can be executed in parallel.
- An implication of this theorem is that there is no automatic technique, at compile time or run time, that can exploit all the parallelism in a sequential program.

To overcome this theoretical limitation, two solutions have been suggested:

- The first is to abolish the imperative style altogether and use a programming language that makes parallelism recognition easier.
- The second is to use explicit parallelism.
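Bernstein's conditions themselves are easy to state and check once the read and write sets of two program segments are known: the segments may run in parallel only if R1∩W2, W1∩R2 and W1∩W2 are all empty. A minimal sketch (the read/write sets are hand-supplied here; computing them automatically, in the presence of pointers, aliasing, or array subscripts, is exactly what makes the general decision hard for a compiler):

```python
def bernstein_parallel(r1, w1, r2, w2):
    """Bernstein's conditions: two segments with read sets r1, r2 and
    write sets w1, w2 are independent iff all three intersections are empty."""
    return (not (set(r1) & set(w2))      # no flow/anti dependence via w2
            and not (set(w1) & set(r2))  # no flow/anti dependence via w1
            and not (set(w1) & set(w2))) # no output dependence

# S1: a = b + c   (reads b, c; writes a)
# S2: d = e * f   (reads e, f; writes d)   -> independent, can run in parallel
print(bernstein_parallel({'b', 'c'}, {'a'}, {'e', 'f'}, {'d'}))   # True

# S3: a = b + c   (reads b, c; writes a)
# S4: b = a - 1   (reads a;    writes b)   -> dependent, must run in order
print(bernstein_parallel({'b', 'c'}, {'a'}, {'a'}, {'b'}))        # False
```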

Explicit Parallel Programming Models

Three dominant parallel programming models are:

- Data-parallel model
- Message-passing model
- Shared-variable model

Main features of the three models:

Feature          | Data-parallel            | Message-passing | Shared-variable
Control flow     | Single                   | Multiple        | Multiple
Synchrony        | Loosely synchronous      | Asynchronous    | Asynchronous
Address space    | Single                   | Multiple        | Single
Interaction      | Implicit                 | Explicit        | Explicit
Data allocation  | Implicit or semiexplicit | Explicit        | Implicit or semiexplicit

Explicit Parallel Programming Models: Message Passing

Message passing has the following characteristics:

- Multithreading
- Asynchronous parallelism (e.g. MPI reduce)
- Separate address spaces (interaction via MPI/PVM)
- Explicit interaction
- Explicit allocation by the user

Furthermore:

- Message-passing programs are multithreaded and asynchronous, requiring explicit synchronization.
- The model is more flexible than the data-parallel model, but it still lacks support for the work-pool paradigm.
- PVM and MPI can be used.
- Message-passing programs exploit large-grain parallelism.
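The explicit send/receive style can be sketched without an MPI installation by emulating per-process mailboxes with queues (an emulation only: real message-passing programs under MPI or PVM run in genuinely separate address spaces, which threads sharing one interpreter do not; the `send`/`recv` helpers and the rank numbering are hypothetical names for this sketch):

```python
import queue
import threading

# One mailbox per "process" rank; all interaction goes through them.
mailbox = {0: queue.Queue(), 1: queue.Queue()}

def send(dest, msg):
    mailbox[dest].put(msg)          # explicit, asynchronous send

def recv(rank):
    return mailbox[rank].get()      # explicit, blocking receive

def process(rank, out):
    if rank == 0:
        send(1, 21)                 # rank 0 sends a value to rank 1
        out[0] = recv(0)            # then blocks waiting for the reply
    else:
        value = recv(1)             # rank 1 receives, computes, replies
        send(0, value * 2)

out = {}
t0 = threading.Thread(target=process, args=(0, out))
t1 = threading.Thread(target=process, args=(1, out))
t0.start(); t1.start()
t0.join(); t1.join()
print(out[0])                       # 42
```

Everything the two ranks exchange is visible in the source as an explicit call, which is the defining trait of the model listed above.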

Explicit Parallel Programming Models: Shared Variable Model

- It has a single address space (similar to the data-parallel model).
- It is multithreaded and asynchronous (similar to the message-passing model).
- Data reside in a single shared address space and thus do not have to be explicitly allocated.
- The workload can be either explicitly or implicitly allocated.
- Communication is done implicitly, through shared variables; synchronization is explicit.

Furthermore:

- The shared-variable model assumes the existence of a single, shared address space where all shared data reside.
- Programs are multithreaded and asynchronous, requiring explicit synchronization.
- For efficient parallel programs that are loosely synchronous and have regular communication patterns, the shared-variable approach is not easier than the message-passing model.
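A minimal shared-variable sketch (illustrative, with threads and a plain counter): the data needs no explicit allocation or sending because every thread sees the same address space, but the synchronization, here a lock, must be written explicitly or the concurrent updates would race:

```python
import threading

counter = 0                        # shared data: visible to all threads
lock = threading.Lock()

def worker():
    global counter
    for _ in range(10000):
        with lock:                 # explicit synchronization
            counter += 1           # implicit communication through shared memory

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)                     # 40000: no increments lost under the lock
```

Dropping the `with lock:` line makes the program an example of why explicit synchronization is required: the read-modify-write of `counter += 1` can then interleave and lose updates.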

Other Parallel Programming Models

- Functional programming
- Logic programming
- Computing by learning
- Object oriented programming
