memgraph

Memgraph is an open-source, ACID-compliant, high-performance, transactional, distributed, in-memory graph database built for real-time streaming data and compatible with Neo4j. It features native query compilation at runtime, lock-free data structures, multi-version concurrency control and asynchronous IO.

dependencies

Memgraph can be compiled with any modern C++ compiler. It mostly relies on the Standard Template Library; however, some components require external libraries.

Some code uses Linux-specific libraries, and the build is only supported on a 64-bit Linux kernel.

  • Linux
  • Clang 3.5 or GCC 4.8 (for good C++11 support, especially lock-free atomics)
  • Boost 1.55 (or newer; most recent versions should work)
  • lexertl (2015-07-14)
  • lemon (parser generator)
  • catch (for compiling tests)

Bolt

The Bolt protocol support depends on OpenSSL, so install the development headers:

sudo apt-get install libssl-dev
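
To quickly check that the headers are in place, a tiny standalone program such as the one below should compile and print the library version (an illustrative sketch with a hypothetical file name, not part of the Memgraph build):

// check_openssl.cpp -- compiles only if the OpenSSL development headers are present
#include <openssl/opensslv.h>
#include <cstdio>

int main() {
    // OPENSSL_VERSION_TEXT is a string macro provided by opensslv.h
    std::printf("%s\n", OPENSSL_VERSION_TEXT);
    return 0;
}

Build it with e.g. clang++ check_openssl.cpp -o check_openssl; no linking against libssl is needed for this macro-only check.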

build

cd build
cmake -DCMAKE_C_COMPILER=clang -DCMAKE_CXX_COMPILER=clang++ -DRUNTIME_ASSERT=OFF -DTHROW_EXCEPTION_ON_ERROR=OFF -DNDEBUG=OFF ..

# Flags:
#   -DCMAKE_C_COMPILER=clang
#   -DCMAKE_CXX_COMPILER=clang++
#   -DRUNTIME_ASSERT=OFF
#   -DTHROW_EXCEPTION_ON_ERROR=OFF
#   -DNDEBUG=ON
#   -DCMAKE_BUILD_TYPE:STRING=debug
#   -DCMAKE_BUILD_TYPE:STRING=release

make
ctest                              # run all registered tests
ctest -V                           # run tests with verbose output
ctest -R test_name                 # run only tests whose names match the given regex
ctest -R unit                      # run the unit tests
ctest -R concurrent                # run the concurrency tests
ctest -R concurrent --parallel 4   # run the concurrency tests, up to 4 at a time
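
The -DNDEBUG flag listed above has its usual C/C++ meaning (assuming the CMake option simply adds the NDEBUG define): when NDEBUG is set, standard assert() calls are compiled out. A minimal standalone illustration, not Memgraph code:

// ndebug_demo.cpp -- shows the standard effect of the NDEBUG define on assert()
#include <cassert>
#include <cstdio>

int main() {
#ifdef NDEBUG
    std::puts("NDEBUG defined: assert() is compiled out");
#else
    std::puts("NDEBUG not defined: assert() is active");
#endif
    assert(2 + 2 == 4);  // becomes a no-op when NDEBUG is defined
    return 0;
}

Compare clang++ ndebug_demo.cpp with clang++ -DNDEBUG ndebug_demo.cpp to see the difference.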

Proof of concept

Dressipi

Run command:

cd build/poc
MEMGRAPH_CONFIG=/path/to/config/memgraph.yaml ./poc_astar -v "dressipi/graph_nodes_export.csv" -e "dressipi/graph_edges_export.csv"

Powerlinx

Run command:

cd build/poc
MEMGRAPH_CONFIG=/path/to/config/memgraph.yaml ./profile -d ";" \
    -v "node_account.csv" -v "node_company.csv" -v "node_opportunity.csv" -v "node_personnel.csv" \
    -e "rel_created.csv" -e "rel_has_match.csv" -e "rel_interested_in.csv" -e "rel_is_employee.csv" \
    -e "rel_partnered_with.csv" -e "rel_reached_to.csv" -e "rel_searched_and_clicked.csv" \
    -e "rel_viewed.csv" -e "rel_works_in.csv"

Custom test build example

clang++ -std=c++1y -o concurrent_skiplist ../tests/concurrent/skiplist.cpp -pg -I../src/ -I../libs/fmt -Lfmt-prefix/src/fmt-build/fmt -lfmt -lpthread
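
For context, the test sources compiled this way are Catch test cases. Below is a minimal, hypothetical example of such a test (not an actual file from tests/); it could be built analogously to the skiplist command above, with the Catch headers on the include path:

// minimal_catch_example.cpp -- hypothetical Catch test, for illustration only
#define CATCH_CONFIG_MAIN  // ask Catch to supply main()
#include "catch.hpp"

#include <vector>

TEST_CASE("vector grows when elements are pushed", "[example]") {
    std::vector<int> v;
    v.push_back(42);
    REQUIRE(v.size() == 1);
    REQUIRE(v.front() == 42);
}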

TODO

  • implement basic subset of queries:

    • create node
    • create edge
    • find node
    • find edge
    • update node
    • update edge
    • delete node
    • delete edge
  • implement indexes:

    • label index
    • type index
    • property index
  • move from a header-only approach to .hpp + .cpp

    • query engine has to be statically linked with the rest of the code
  • unit tests for manually implemented queries

    • src/query_engine/main_queries.cpp -> tests/unit/manual_queries
  • unit tests for query_engine

  • console with history (see the readline sketch below)
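
The last item, a console with history, could be prototyped with GNU readline; a minimal sketch under that assumption (illustrative only, not the project's planned implementation):

// repl_sketch.cpp -- toy console-with-history loop using GNU readline
// build with something like: clang++ -std=c++1y repl_sketch.cpp -o repl_sketch -lreadline
#include <stdio.h>
#include <stdlib.h>
#include <readline/readline.h>
#include <readline/history.h>

int main() {
    // readline() returns NULL on EOF (Ctrl-D), which ends the loop
    while (char *line = readline("memgraph> ")) {
        if (*line != '\0') {
            add_history(line);                    // remember non-empty input for arrow-key recall
            printf("would execute: %s\n", line);  // a real console would run the query here
        }
        free(line);                               // readline() allocates the line with malloc
    }
    return 0;
}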