LDBC Social Network Benchmarking

Processes, policies and audit workflow for the LDBC Social Network Benchmark

Introduction

LDBC’s Social Network Benchmark (LDBC SNB) is an effort intended to test various functionalities of systems used for graph-like data management. For this, LDBC SNB uses the recognizable scenario of operating a social network, characterized by its graph-shaped data.

The LDBC SNB benchmark is modeled around the operation of a real social network site. This represents a relevant use case for the following reasons:

  • It is simple to understand for a large audience, as it is arguably present in our every-day life in different shapes and forms.
  • It allows testing a complete range of interesting challenges by means of different workloads targeting systems of different nature and characteristics.
  • A social network can be scaled, allowing the design of a scalable benchmark targeting systems of different sizes and budgets.

Our service

Whether you are an end-user seeking support with technology selection as you embark on a digital transformation built around graph technologies, or a vendor looking to understand the performance characteristics of your software for development direction and product positioning, our implementation of the LDBC SNB audit is an effective way to assess the performance of graph-like data management technologies.

Benchmark Overview

  • Rich coverage  LDBC SNB is intended to cover most demands encountered in the management of complex, structured data.
  • Modularity  LDBC SNB is broken into parts that can be individually addressed. In this way, LDBC SNB stimulates innovation without imposing a high threshold for participation.
  • Reasonable implementation cost  For a product offering the relevant functionality, the effort required to obtain initial results with SNB should be small, on the order of days.
  • Relevant selection of challenges  Benchmarks are known to direct product development in certain directions. LDBC SNB is informed by the state of the art in database research, so as to offer optimisation challenges for years to come while not setting a prohibitively high threshold for entry.
  • Reproducibility and documentation of results  LDBC SNB specifies the rules for full disclosure of benchmark execution and for auditing of benchmark runs. The workloads may be run on any equipment, but the exact configuration and price of the hardware and software must be disclosed.

History and Further Information

The team at Chorograph audited FMA Technologies' flagship product, TuGraph 1.10 (formerly known as Lightgraph), using the SNB Interactive Workload.

Ben Steer of Chorograph, also an LDBC board member, played a leading role in developing the organisation's most successful and enduring industry-standard benchmarks.

The full results of the LDBC SNB Benchmark can be found on the LDBC website: http://ldbcouncil.org/benchmarks/snb