

Conclusion and Future Plans

In this paper, we described a methodology for automating the synthesis of task graphs for parallel programs, using sophisticated parallelizing compiler techniques. The techniques in this paper can be used without user intervention to construct task graphs for message-passing programs compiled from HPF source programs, and we believe they extend directly to existing message-passing (e.g., MPI) programs as well. Such techniques can make a large body of existing research based on task graphs and equivalent representations applicable to these widely used programming standards. Our immediate goals for the future are: (1) to demonstrate that the techniques described in this paper can be applied to message-passing programs (using MPI), by extracting the requisite computation partitioning and communication information; and (2) to couple the compiler-generated task graphs with the wide range of modeling approaches being used within the POEMS project, including analytical, simulation and hybrid models.



