GraphGrind: addressing load imbalance of graph partitioning

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

12 Citations (Scopus)
545 Downloads (Pure)

Abstract

We investigate how graph partitioning adversely affects the performance of graph analytics. We demonstrate that graph partitioning induces extra work during graph traversal and that graph partitions have markedly different connectivity than the original graph. As a consequence, increasing the number of partitions reaches a tipping point after which overheads quickly dominate performance gains. Moreover, we show that the heuristic of balancing CPU load across graph partitions by equalising their edge counts is inappropriate for a range of graph analyses, and that even when it is appropriate, it is sub-optimal due to the skewed degree distribution of social networks.
Based on these observations, we propose GraphGrind, a new graph analytics system that addresses the limitations incurred by graph partitioning. We moreover propose a NUMA-aware extension to the Cilk programming language, obtaining a scale-free yet NUMA-aware parallel programming environment that underpins NUMA-aware scheduling in GraphGrind. We demonstrate that GraphGrind outperforms state-of-the-art shared-memory graph analytics systems, including Ligra, Polymer and Galois.
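The abstract's point about edge-count balancing can be illustrated with a small, hypothetical sketch (not taken from GraphGrind's source code): splitting vertices into contiguous ranges with roughly equal edge counts leaves per-partition vertex counts wildly uneven when the degree distribution is skewed, so any work proportional to vertex count becomes imbalanced.

```python
# Hypothetical illustration: edge-balanced contiguous partitioning of a
# skewed degree sequence. Function name and data are invented for the sketch.

def edge_balanced_partitions(degrees, num_parts):
    """Split vertices [0, n) into contiguous ranges with ~equal edge counts."""
    total_edges = sum(degrees)
    target = total_edges / num_parts
    boundaries = []
    acc = 0
    for v, d in enumerate(degrees):
        acc += d
        # Cut a new partition each time the cumulative edge count crosses
        # the next multiple of the per-partition target.
        if acc >= target * (len(boundaries) + 1) and len(boundaries) < num_parts - 1:
            boundaries.append(v + 1)
    starts = [0] + boundaries
    ends = boundaries + [len(degrees)]
    return list(zip(starts, ends))

# A social-network-like degree sequence: a few hubs, many low-degree vertices.
degrees = [1000, 500, 250] + [1] * 997

parts = edge_balanced_partitions(degrees, 4)
for lo, hi in parts:
    print(f"vertices [{lo}, {hi}): {hi - lo} vertices, {sum(degrees[lo:hi])} edges")
```

With this input the four partitions hold similar edge counts, but their vertex counts range from a single hub vertex to several hundred leaves, illustrating why edge balancing alone can leave CPU load imbalanced for vertex-centric work.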


Original language: English
Title of host publication: Proceedings of the International Conference on Supercomputing
Number of pages: 10
ISBN (Electronic): 978-1-4503-5020-4
DOIs
Publication status: Published - 14 Jun 2017
Event: International Conference on Supercomputing 2017 - Chicago, IL, United States
Duration: 14 Jun 2017 - 16 Jun 2017
http://press3.mcs.anl.gov/ics2017/

Conference

Conference: International Conference on Supercomputing 2017
Abbreviated title: ICS'17
Country: United States
City: Chicago
Period: 14/06/2017 - 16/06/2017
Internet address: http://press3.mcs.anl.gov/ics2017/

