pnnl/qualm
QuaL²M (QuaLM): Quantitative Learned Latency Model

Home:

About: QuaL²M (QuaLM), the Quantitative Learned Latency Model, is the implementation of a learning-based methodology for quantitatively predicting the performance of optimized latency-sensitive code on CPUs. QuaLM distinguishes superblock behavior by combining lightweight telemetry from performance monitoring units (PMUs) with readily obtainable compiler execution models. To capture both the cost distribution and the most severe bottlenecks, QuaLM combines classification and regression using ensemble decision trees, which also provide a degree of interpretability.

Contacts: (firstname.lastname@pnnl.gov)

  • Arun Sathanur
  • Nathan R. Tallent (www), (www)

References

  • Arun Sathanur, Nathan R. Tallent, P. Konsor, K. Koyanagi, R. McLaughlin, J. Olivas, and M. Chynoweth, "QuaL²M (QuaLM): Learning quantitative performance of latency-sensitive code," in Proc. of the 2022 IEEE Intl. Parallel and Distributed Processing Symp. Workshops (17th Intl. Workshop on Automatic Performance Tuning), May 2022

Details

The repo consists of three main classes (and their associated methods), plus example driver scripts to perform analyses.

a. preproces.py : Contains all methods related to data pre-processing, plotting distributions, etc.

b. supervised.py : Contains all the methods related to supervised learning (regression, classification)

c. unsupervised.py : Contains all methods related to unsupervised learning (PCA, clustering, etc.)

The main driver scripts utilized thus far are:

a. analysisMain.py : Used for a typical analysis pipeline

b. multiScaleModels.py : Used to execute the proposed multi-stage model
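The multi-stage idea (classification combined with regression over ensemble decision trees, as described above) can be sketched roughly as follows. This is a minimal illustration using scikit-learn on synthetic data; the feature layout, latency threshold, and model settings are assumptions for the sketch, not taken from the repo's scripts.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic stand-ins for per-superblock features (e.g., PMU telemetry
# and compiler-model estimates) and measured latencies.
X = rng.normal(size=(500, 8))
latency = np.abs(3.0 * X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=500))

# Stage 1: classify superblocks as "expensive" vs. "cheap" relative to
# a hypothetical latency threshold (here, the median).
threshold = np.median(latency)
is_expensive = (latency > threshold).astype(int)
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, is_expensive)

# Stage 2: regress quantitative latency on the superblocks the
# classifier flags as expensive (the most severe bottlenecks).
flagged = clf.predict(X) == 1
reg = RandomForestRegressor(n_estimators=50, random_state=0).fit(
    X[flagged], latency[flagged]
)
predicted = reg.predict(X[flagged])
```

The two-stage split lets the regressor focus its capacity on the costly tail of the distribution instead of the (typically more numerous) cheap superblocks.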

The data Excel file, the associated target function, and the range of columns used as features are all typically specified within the driver scripts and are usually self-explanatory. The scripts currently support both MCA and PMU features, with options to model either or both; appropriate normalizations are performed based on these flags.
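As a rough illustration of the kind of configuration the driver scripts embed (a feature-column selection plus MCA/PMU flags and normalization), here is a hedged sketch. The column names, flag names, and min-max normalization are hypothetical choices for the example, not taken from the repo; a real driver would load the DataFrame with pd.read_excel.

```python
import pandas as pd

# Hypothetical in-memory stand-in for the data Excel file.
df = pd.DataFrame({
    "block_id":   [0, 1, 2, 3],
    "mca_cycles": [10.0, 40.0, 25.0, 5.0],     # MCA (compiler-model) feature
    "pmu_stalls": [100.0, 400.0, 250.0, 50.0], # PMU telemetry feature
    "latency":    [12.0, 45.0, 27.0, 6.0],     # target
})

# Hypothetical flags selecting which feature group(s) to model.
USE_MCA, USE_PMU = True, True

feature_cols = []
if USE_MCA:
    feature_cols += ["mca_cycles"]
if USE_PMU:
    feature_cols += ["pmu_stalls"]

# Per-column min-max normalization: one plausible choice; the actual
# normalization applied depends on the flags set in the driver scripts.
X = df[feature_cols]
X_norm = (X - X.min()) / (X.max() - X.min())
y = df["latency"]
```

X_norm and y would then feed the supervised (or unsupervised) analysis classes described above.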
