
Saturday, July 9, 2011

Parallel Computing?

Traditionally, software has been written for serial computation:
  • To be run on a single computer having a single Central Processing Unit (CPU);
  • A problem is broken into a discrete series of instructions;
  • Instructions are executed one after another;
  • Only one instruction may execute at any moment in time (see the sketch below).

[Figure: Serial computing]
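To make this concrete, here is a minimal Python sketch of the serial model (counting primes is just a toy problem chosen for illustration): a single CPU works through one stream of instructions, one at a time.

# Serial model: one CPU, one stream of instructions, one at a time.
def is_prime(n):
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

def count_primes(limit):
    count = 0
    for n in range(limit):          # each candidate is handled in turn
        count += is_prime(n)        # only one instruction runs at any moment
    return count

print(count_primes(100_000))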

In the simplest sense, parallel computing is the simultaneous use of multiple compute resources to solve a computational problem:
  • To be run using multiple CPUs;
  • A problem is broken into discrete parts that can be solved concurrently;
  • Each part is further broken down to a series of instructions;
  • Instructions from each part execute simultaneously on different CPUs (see the sketch below).


[Figure: Parallel computing]
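And here is the same toy problem recast in the parallel model: the range is broken into discrete parts, each part runs simultaneously on a different CPU, and the partial counts are combined at the end. This sketch uses Python's standard multiprocessing module as one concrete way to put multiple CPUs to work; the decomposition is deliberately simple.

# Parallel model: break the problem into discrete parts and solve
# the parts simultaneously on different CPUs.
from multiprocessing import Pool, cpu_count

def is_prime(n):
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

def count_primes_in(bounds):
    lo, hi = bounds                  # one discrete part of the problem
    return sum(is_prime(n) for n in range(lo, hi))

if __name__ == "__main__":
    limit, workers = 100_000, cpu_count()
    step = limit // workers
    parts = [(i * step, limit if i == workers - 1 else (i + 1) * step)
             for i in range(workers)]
    with Pool(workers) as pool:      # one worker process per CPU
        total = sum(pool.map(count_primes_in, parts))
    print(total)

On a multi-core machine this finishes in roughly 1/N of the serial time for N CPUs, minus the overhead of starting the worker processes.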


  • The compute resources can include:
    • A single computer with multiple processors;
    • An arbitrary number of computers connected by a network;
    • A combination of both.
  • The computational problem usually demonstrates characteristics such as the ability to:
    • Be broken apart into discrete pieces of work that can be solved simultaneously;
    • Execute multiple program instructions at any moment in time;
    • Be solved in less time with multiple compute resources than with a single compute resource.

The Universe is Parallel:
  • Parallel computing is an evolution of serial computing that attempts to emulate what has always been the state of affairs in the natural world: many complex, interrelated events happening at the same time, yet within a sequence. For example:
    • Galaxy formation
    • Planetary movement
    • Weather and ocean patterns
    • Tectonic plate drift
    • Rush hour traffic
    • Automobile assembly line
    • Building a space shuttle
    • Ordering a hamburger at the drive-through

Why Use Parallel Computing?

Main Reasons:
  • Save time and/or money: In theory, throwing more resources at a task will shorten its time to completion, with potential cost savings. Parallel clusters can be built from cheap, commodity components.
  • Solve larger problems: Many problems are so large and/or complex that it is impractical or impossible to solve them on a single computer, especially given limited computer memory. For example:
    • "Grand Challenge" (en.wikipedia.org/wiki/Grand_Challenge) problems requiring PetaFLOPS and PetaBytes of computing resources;
    • Web search engines/databases processing millions of transactions per second.
  • Provide concurrency: A single compute resource can only do one thing at a time. Multiple computing resources can be doing many things simultaneously. For example, the Access Grid (www.accessgrid.org) provides a global collaboration network where people from around the world can meet and conduct work "virtually".
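As a small illustration of concurrency (the tasks and timings below are made up), here is a Python sketch in which several independent tasks are in flight at once rather than waiting in line behind a single resource:

# Concurrency: several independent tasks making progress at the same time.
import threading
import time

def task(name, seconds):
    time.sleep(seconds)              # stands in for real work or I/O
    print(name, "finished after", seconds, "s")

start = time.time()
threads = [threading.Thread(target=task, args=("task-%d" % i, 1))
           for i in range(4)]
for t in threads:
    t.start()                        # all four tasks are now in progress
for t in threads:
    t.join()
print("elapsed: %.1fs" % (time.time() - start))   # ~1s, not ~4s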




