This tool is an initiative for a source-to-source parallelizing compiler that creates human-readable parallelization metadata. It is written in Python and relies on pycparser to annotate C99 source code with OpenMP and OpenACC directives, given a metadata file called parallel.yml.
Suppose we have to parallelize the code of the following function:
#include <stdlib.h>
#include <stdio.h>

#define N 10

void some_function(){
    int A[N];
    int B[N];
    int C[N];

    for(int i=0; i<N; ++i){
        C[i] = A[i] + B[i];
    }
}

int main(){
    some_function();
}
So we create a parallel.yml as shown below.
version: 1.0
name: 'example'
description: |
    This is a template of a parallel file.

functs:
    # List of the functions available in the source code.
    all:
        - main
        - some_function

    # Defines just the functions that are parallelizable and how to parallelize them.
    parallel:
        # Function to annotate.
        some_function:
            # OpenMP directives.
            mp:
                # parallel for directive: applied to, or enclosing, the loops listed below.
                parallel_for:
                    # Loop 0, the first loop (in lexicographic order) inside the function some_function.
                    - nro: 0
                      clauses:
                          private: [i]
                          reduction: '+:sum'
pragcc takes the parallel.yml file and the source code and produces the annotated code shown below:
#include <stdlib.h>
#include <stdio.h>

#define N 10

void some_function(){
    int A[N];
    int B[N];
    int C[N];

    #pragma omp parallel for private(i) reduction(+:sum)
    for(int i=0; i<N; ++i){
        C[i] = A[i] + B[i];
    }
}

int main(){
    some_function();
}
Note: The correct parallelization of the program depends on how the parallel file is defined.
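As the example shows, the directive and its clauses are taken directly from the parallel file, so they must reference variables that actually exist in the annotated function; the resulting file can then be built with an OpenMP-capable compiler (for example, gcc with the -fopenmp flag).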
To test the pragcc REST API, install the api module requirements and start the API server as follows:
pip3 install -r ./api/requirements.txt
python3 api/app.py
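Once the server is up, an annotation request can be sent over HTTP. The sketch below uses the requests library; the route /parallelize and the field names source and parallel are assumptions made for illustration, so check api/app.py for the actual endpoints and payload format.

import requests

# The route and field names below are assumed for illustration only;
# the real ones are defined in api/app.py.
with open('example.c', 'rb') as src, open('parallel.yml', 'rb') as meta:
    response = requests.post(
        'http://localhost:5000/parallelize',       # assumed route
        files={'source': src, 'parallel': meta},   # assumed field names
    )

print(response.status_code)
print(response.text)  # the annotated C code, if the request succeeded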
Run the following commands from the pragcc project root directory to build the pragcc image and then run the pragcc API in a container.
docker build -t pragcc .
docker run -d -v ${PWD}:/usr/src/app --name pragcc -p 5000:5000 pragcc
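Because the run command maps port 5000, the containerized API is reachable at http://localhost:5000, the same address used when running api/app.py directly, so the request sketch above applies unchanged.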
- There is no error handling.
- If the parallel.yml file does not have the correct format, the annotation process can fail or the resulting code is left unannotated, so the user needs to know what happened (see the validation sketch after this list).
- The code to parallelize must be C99 source code, which is the version of C supported by pycparser.
- The code must have at least two include headers.
- The code should be organized as follows.
/* Headers */
#include <stdlib.h>   /* Standard library: malloc, calloc, free, realloc */
#include <stdbool.h>  /* Bool type library */

/* Declarations */
#define RowDim 20

/* Functions */
int main(int argc, char const **argv)
{
    return EXIT_SUCCESS;
}
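As a first step toward the missing error handling mentioned above, a parallel file could be checked for the expected keys before annotation is attempted. The following is a minimal sketch, assuming PyYAML is installed; the function check_parallel_file and the set of required keys are illustrative only and mirror the example file shown earlier, not pragcc's actual schema.

import yaml


def check_parallel_file(path):
    """Report missing keys in a parallel.yml file (illustrative only)."""
    with open(path) as f:
        data = yaml.safe_load(f)

    errors = []
    for key in ('version', 'name', 'functs'):
        if key not in data:
            errors.append('missing top-level key: %s' % key)

    functs = data.get('functs', {})
    for name, directives in functs.get('parallel', {}).items():
        loops = directives.get('mp', {}).get('parallel_for', [])
        for loop in loops:
            if 'nro' not in loop:
                errors.append("function %s: loop entry without 'nro'" % name)
            if 'clauses' not in loop:
                errors.append("function %s: loop entry without 'clauses'" % name)
    return errors


print(check_parallel_file('parallel.yml'))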
- Since the parallelization of the C99 code is not automatic, i.e., the user must create the parallel.yml file by hand, a code analyzer could be written that analyzes the source code and generates the parallel.yml metadata (a sketch of the idea is given after this list).
- Solve the include headers issue.
- Add error handling.
- Test what happens if some key is missing in the parallel.yml file, for example if the parallel_for directive does not have the clauses key.
- Create a TestCase for each directive, checking the behavior of pragcc when some keys are missing.
- Test the parallelize method properly.
- Check what happens when a given directive's clauses are misspelled, for example when the user writes 'num_theras' instead of 'num_threads'.
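For the code analyzer idea above, a rough sketch of how it could start is shown below, assuming pycparser and PyYAML are installed. It only records which functions contain for loops and emits a parallel.yml skeleton with empty clauses; the names build_parallel_skeleton, LoopCollector and ForCounter are illustrative, and a real analyzer would also need dependence analysis to decide which loops are safe to parallelize and which clauses to emit.

import yaml
from pycparser import parse_file, c_ast


class ForCounter(c_ast.NodeVisitor):
    """Count the for loops in a subtree of the AST."""
    def __init__(self):
        self.count = 0

    def visit_For(self, node):
        self.count += 1
        self.generic_visit(node)  # also count nested loops


class LoopCollector(c_ast.NodeVisitor):
    """Record every function definition and how many for loops it contains."""
    def __init__(self):
        self.functions = []  # all function names, in definition order
        self.loops = {}      # function name -> number of for loops

    def visit_FuncDef(self, node):
        name = node.decl.name
        self.functions.append(name)
        counter = ForCounter()
        counter.visit(node.body)
        if counter.count:
            self.loops[name] = counter.count


def build_parallel_skeleton(c_file):
    # use_cpp=True runs the C preprocessor; parsing code that includes real
    # system headers may additionally need pycparser's fake libc headers.
    ast = parse_file(c_file, use_cpp=True)
    collector = LoopCollector()
    collector.visit(ast)

    parallel = {
        name: {'mp': {'parallel_for': [{'nro': i, 'clauses': {}}
                                       for i in range(count)]}}
        for name, count in collector.loops.items()
    }
    return {
        'version': 1.0,
        'name': 'generated',
        'functs': {'all': collector.functions, 'parallel': parallel},
    }


print(yaml.safe_dump(build_parallel_skeleton('example.c')))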