MPI: Message Passing Interface

By: Joe Schmitz

CEN 4516 Computer Networks

Florida Gulf Coast University

Ft. Myers, FL

9/29/05

I. Introduction to Network Programming

Ever since Sun Microsystems coined the phrase, “The network is the computer,” computer scientists have been studying networks and how computers can be linked together. Some of the basic topics in network programming are security, database connectivity, and the use of network protocols, such as TCP and UDP, for different applications.

II. Problem Definition

The topic that I’ll be addressing is a technology called MPI, the Message Passing Interface. The details of the technology will be presented in a PowerPoint presentation and in the next section.

III. Description of the Technology

In short, MPI is a programming tool that allows for parallel computation on a variety of independent platforms, including Unix and Windows. MPI is not tied to one programming language either; it can be used from C, C++, Fortran, and Ada. Since its inception in 1994, it has spawned implementations such as Open MPI (not to be confused with OpenMP, which is a separate shared-memory technology) [2].

IV. Programming Example

mpi_example.c:

#include <stdio.h>
#include <mpi.h>

int main(int argc, char *argv[]) {
    int rank;

    MPI_Init(&argc, &argv);                  /* set up the MPI environment */

    /* MPI functions and other work go here, e.g. print this process's rank */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    printf("Hello from process %d\n", rank);

    MPI_Finalize();                          /* clean up before the program exits */
    return 0;
}

In this example, MPI must first be initialized by calling MPI_Init. The middle of the program is where the MPI calls and other work go, such as MPI_Comm_rank and printf() above. MPI_Finalize() is used to clean up the MPI environment before the program exits [3].
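To illustrate the message passing that gives MPI its name, here is a minimal sketch (not part of the original example, but using the same standard MPI C calls) in which process 0 sends an integer to process 1, which receives and prints it; the tag and payload values are arbitrary choices for the illustration.

#include <stdio.h>
#include <mpi.h>

int main(int argc, char *argv[]) {
    int rank, value = 42;                    /* arbitrary payload for the illustration */
    MPI_Status status;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);    /* which process am I? */

    if (rank == 0) {
        /* process 0 sends one int to process 1 with message tag 0 */
        MPI_Send(&value, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
    } else if (rank == 1) {
        /* process 1 receives the int from process 0 and prints it */
        MPI_Recv(&value, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, &status);
        printf("Process 1 received %d from process 0\n", value);
    }

    MPI_Finalize();
    return 0;
}

A program like this is typically built with the MPI compiler wrapper and started across multiple processes with a launcher, for example mpicc mpi_example.c -o mpi_example followed by mpirun -np 2 ./mpi_example; the exact commands depend on the MPI implementation installed.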

V. Conclusion

From the initial research, using MPI on a large scale, such as in scientific research, can have a profound impact on performance. More research will be done in order to draw a stronger conclusion.

VI. References

[1] MPI-2: Extensions to the Message-Passing Interface, University of Tennessee, 1997.

[2] “Message Passing Interface,” Wikipedia, Wikimedia Foundation, accessed Sept. 28, 2005.

[3] “MPI – Parallel Programming,” ParaWiki, University of Kassel, Feb. 23, 2005.

[4] Gropp, William, Argonne National Laboratory, Argonne, IL.
