The Journal of Instruction-Level Parallelism
Championship Branch Prediction
The workshop on computer architecture competitions is a
forum for holding competitions to evaluate computer architecture research
topics. The second workshop is organized around a competition for branch
prediction algorithms. The Championship Branch Prediction (CBP) invites
contestants to submit their branch prediction code to participate in this
competition. Contestants will be given a fixed storage budget to implement
their best predictors on a common evaluation framework provided by the
organizing committee.

Objective

The goal for this competition is to compare different
branch prediction algorithms in a common framework. Predictors will be
evaluated for two tracks: conditional branches and indirect
branches. Predictors must be implemented within a fixed storage
budget as specified in the competition rules. The simple and transparent
evaluation process enables dissemination of results and techniques to the
larger computer architecture community and allows independent verification of
results.

Prizes

The championship has two tracks: a conditional branch
prediction track and an indirect branch prediction track. The top performer
for each track will receive a trophy commemorating his/her triumph (OR some
other prize to be determined later). Top submissions will be invited to
present at the workshop, when results will be announced. All source code,
write-ups and performance results will be made publicly available through the
JWAC-2 website. Authors of accepted workshop papers will be invited to submit
full papers for possible inclusion in a special issue of the Journal of
Instruction-Level Parallelism (JILP). Inclusion in the special issue will
depend on the outcome of JILP's peer-review process: invited papers will be
held to the same standard as regular submissions.

Submission Requirements

Each contestant is allowed a maximum of three submissions to the competition. Each submission should include the following:

o Abstract: A 300-word abstract summarizing
the submission. In addition, the abstract should include the competition
track for the submission, author names, their affiliations, and the email
address of the contact author.

o Paper: This will be a
conference-quality write-up of the branch prediction algorithm, including
references to relevant related work. The paper must clearly describe how the
algorithm works, how it is practical to implement, and how it conforms to the
contest rules and fits within the allocated storage budget. The paper must be
written in English and formatted as follows: no more than four pages,
single-spaced, two-column format, minimum 10pt Times New Roman font. The
paper should be submitted in .pdf format, and should be printable on
letter-size paper with one-inch margins on all sides. A submission will be
disqualified if the paper does not clearly describe the algorithm that
corresponds to the submitted code. Papers that do not conform to the length
and format rules will only be reviewed at the discretion of the program committee.
For contestants with more than one submission, papers need to be sufficiently
different, and not variations of the same basic idea. If papers are
variations on the same idea, contestants should only submit the version that
has the best performance.

o Results: A table that gives performance
of the branch prediction code for the distributed trace list (which will be
verified independently by the organizing committee).

o Branch Prediction Code: Two files (predictor.cc and
predictor.h) that contain the predictor. This code must be well commented so
that it can be understood and evaluated. Unreadable or insufficiently
documented code will be rejected by the program committee. The code should compile and run on the existing infrastructure without changes to the framework code or Makefile, and should NOT require any library code that is not part of standard C++.
All code should be in ANSI C/C++ and POSIX conformant. We will compile the
code using GCC/G++ version 3.3.3 (or higher) on a 64-bit GNU/Linux system,
and if we can't compile and run the code, we can't evaluate the predictor. Details on how to submit these files are available on the JWAC-2 website.
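For illustration only, the sketch below shows one way the two files might be organized for the conditional branch track. The class name (SamplePredictor), the method names (GetPrediction, Update), and the table layout are all hypothetical; the actual interface is defined by the header files in the distributed CBP-3 kit, and submissions must conform to that interface rather than to this sketch.

    // predictor.h -- purely hypothetical skeleton; the real interface is defined
    // by the header files in the distributed CBP-3 kit.
    #include <stdint.h>
    #include <cstring>

    class SamplePredictor {                          // hypothetical class name
    public:
        SamplePredictor() { std::memset(table, 0, sizeof(table)); }

        // Predict taken (true) or not-taken (false) from a 2-bit saturating counter.
        bool GetPrediction(uint64_t pc) const { return table[pc % kEntries] >= 2; }

        // Train the selected counter with the resolved branch outcome.
        void Update(uint64_t pc, bool taken) {
            uint8_t &ctr = table[pc % kEntries];
            if (taken && ctr < 3) ctr++;
            else if (!taken && ctr > 0) ctr--;
        }

    private:
        // 32K two-bit counters, stored one per byte here for simplicity; only
        // 2 bits per entry are predictor state (64 Kbit = 8 KB of the budget).
        static const unsigned kEntries = 32 * 1024;
        uint8_t table[kEntries];
    };

    // predictor.cc would hold any non-trivial method definitions, plus comments
    // documenting how each structure fits within the storage budget.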
Competition Rules

The competition will proceed as follows. Contestants are
responsible for implementing and evaluating their algorithm in the
distributed framework. Submissions will be compiled and run with the original
version of the framework. Quantitatively assessing the cost/complexity of
predictors is difficult. To simplify the review process, maximize
transparency, and minimize the role of subjectivity in selecting a champion,
CBP-3 will make no attempt to assess the cost/complexity of predictor
algorithms. Instead, contestants have a storage budget of (64K + 1K) bytes,
i.e., a total of 65 Kilobytes, or 66560 bytes. All predictors must be
implemented within the constraints of this budget for the track of choice.
Clear documentation, in the code as well as the paper writeup, must be
provided to assure that this is the case.
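One way to provide this assurance is to make the budget arithmetic explicit in the source itself. The fragment below is only an illustration with made-up structure sizes; the point is that every table's bit cost is spelled out and the total is checked against the 66560-byte (532480-bit) limit at compile time.

    // Hypothetical storage accounting for a predictor built from three structures.
    const unsigned kBudgetBits  = 66560 * 8;           // (64K + 1K) bytes = 532480 bits
    const unsigned kBimodalBits = 32 * 1024 * 2;        // 32K 2-bit counters        =  65536 bits
    const unsigned kTaggedBits  = 8 * 1024 * (12 + 3);  // 8K entries x 15 bits each = 122880 bits
    const unsigned kHistoryBits = 640;                  // global history register   =    640 bits

    // Compile-time check (no C++11 required): the typedef has a negative array
    // size, and compilation fails, if the structures exceed the budget.
    typedef char BudgetCheck[
        (kBimodalBits + kTaggedBits + kHistoryBits <= kBudgetBits) ? 1 : -1];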
Predictors will be scored on misprediction penalty per thousand instructions (MPPKI) only. The arithmetic mean of the MPPKIs of all 40 traces will be used as the final score of a predictor.
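In other words, if MPPKI_1 through MPPKI_40 denote the per-trace results, the final score is simply

    score = (MPPKI_1 + MPPKI_2 + ... + MPPKI_40) / 40

and lower scores are better.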
Acceptance Criteria

In the interest of assembling a quality program for
workshop attendees and future readers, there will be an overall selection
process, of which performance ranking is the primary component. To be
considered, submissions must conform to the submission requirements described
above. Submissions will be selected to appear in the workshop on the basis of
the performance ranking, novelty, practicality of the predictor, and overall
quality of the paper and commented code. Novelty is not a strict requirement; for example, a contestant may submit his/her previously published design or make incremental enhancements to a previously proposed design. In such cases,
performance is a heavily weighted criterion, as is overall quality of the
paper (for example, analysis of new results on the common framework, etc.).

Description of the Simulation Infrastructure

CBP3 Kit: Download and Directions (Available after February 15, 2011)

Important Dates
Steering Committee

Alaa R. Alameldeen, Intel
Eric Rotenberg, NC State

Organizing Committee

Alaa R. Alameldeen, Intel
Hongliang Gao, Intel (Chair)
Chris Wilkerson, Intel

Program Chair

Trevor Mudge

Program Committee

Alaa Alameldeen, Intel
Hongliang Gao, Intel
Daniel Jimenez, UT-San Antonio
Yale Patt
Andre Seznec, INRIA
Lucian Vintan
Chris Wilkerson, Intel
Affiliated Logos

JILP