NeuraLogic


Integrating deep and relational learning through differentiable logic programming.


About

At the core of the framework, there is a custom language you can use to write differentiable programs encoding your learning scenarios, similarly to classic Deep Learning (DL) frameworks (e.g. TensorFlow). However, the language follows a logic programming paradigm and is declarative in nature (it's similar to Datalog). This means that instead of directly encoding the computation graph, you just declare:

  1. the inputs (and their numeric values, if any)
    • i.e. the observed facts/data = objects, structures, knowledge graphs, relational databases, ...
    • e.g. atom(oxygen_1), 0.3 stable(oxygen), 8 protons(oxygen), 1.7 energy(oxygen,level2), [1.2,0,-1] features(oxygen), [[0,2.1],[1.7,-1]] bond(oxygen_1,hydrogen_2,covalent_1)
  2. the outputs (and their expected values - for supervised learning)
    • i.e. the queries = classification labels, regression targets, ...
    • e.g. 1 class, 4.7 target(molecule_3), 0 relation(carbon,xenon,fluor)
  3. a set of rules applicable in your domain (and their learnable parameters W)
    • i.e. the generic prior/knowledge/bias which you want to use.
      • these rules will be used to (automatically) infer (link) the outputs from the inputs
    • e.g. explicit/interpretable rules such as 0.99 covalent(B) <= oxygen(X), hydrogen(Y), bond(X,Y,B).
    • or more implicit/flexible concepts such as embed(X) <= W_1 embed(Y), bond(X,Y,_).
      • with these you can easily encode a lot of diverse structured deep learning models!
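Put together, the three parts above might look like the following minimal sketch. All predicate names and weight labels here are hypothetical, reusing only the constructs shown above; in the actual datasets under Resources/datasets, the facts, queries, and rules live in separate files:

```
% 1. inputs -- observed facts for one example molecule
atom(o1). atom(h1). atom(h2).
0.3 features(o1). 0.1 features(h1). 0.1 features(h2).
bond(o1, h1, b1). bond(o1, h2, b2).

% 2. output -- the supervised query (label) for this example
1.0 target.

% 3. rules ("template") with learnable parameters W_*
W_h h(X) <= W_f features(Y), W_b bond(X, Y, _).
W_t target <= W_q h(X).
```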

Example

Consider a simple program for learning with molecular data<sup>1</sup>, encoding a generic idea that some hidden representation (predicate h(.)) of any chemical atom (variable X) is somewhat dependent on the other atoms (a(Y)) bound to it (b(X,Y)), with a parameterized rule as:

W_h1 h(X) <= W_a a(Y), W_b b(X,Y).

Additionally, let's assume that representation of a molecule (q) follows from representations of all the contained atoms (h(X)), i.e.:

W_q q <= W_h2 h(X).

These 2 rules, parameterized with the tensors W_*, then form a learning program which can be used to classify molecules. In fact, it directly encodes the popular idea known as Graph Neural Networks. Executing this program ("template") on 2 input molecule samples generates 2 parameterized computation graphs as follows:

[Figure: "Template2Neural Grounding" — the template grounded into the computation graphs of the 2 sample molecules]

Each computation node in the graphs is associated with some (differentiable) activation function defined by a user (or settings). The parameters W_* in the program are then automatically optimized to reflect the expected output values (A_q) through gradient descent.
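To make the grounding concrete, here is a plain-Python sketch (not the NeuraLogic API) of the forward pass the two rules induce for a toy water-like molecule. The scalar weights, sigmoid activation, and sum aggregation are all illustrative assumptions standing in for the learnable tensors W_* and the user-chosen functions:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy input facts: atom features a(Y) and (directed) bond features b(X, Y)
a = {"o1": 0.3, "h1": 0.1, "h2": 0.1}
bond = {("o1", "h1"): 1.0, ("o1", "h2"): 1.0,
        ("h1", "o1"): 1.0, ("h2", "o1"): 1.0}

# Illustrative scalar weights standing in for the learnable tensors W_*
W_a, W_b, W_h1, W_h2, W_q = 0.5, 0.2, 1.0, 0.8, 1.0

def h(X):
    """Rule 1: W_h1 h(X) <= W_a a(Y), W_b b(X,Y).
    One body node per grounding of Y; sum-aggregate, then weight and activate."""
    body = sum(sigmoid(W_a * a[Y] + W_b * w)
               for (x, Y), w in bond.items() if x == X)
    return sigmoid(W_h1 * body)

# Rule 2: W_q q <= W_h2 h(X).  -- aggregates over all atoms X in the molecule
q = sigmoid(W_q * sum(W_h2 * h(X) for X in a))
print(q)  # a single scalar prediction for the whole molecule
```

In a real run, gradient descent would adjust the W_* values so that q matches the expected output A_q for each training molecule; here they are fixed for illustration.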


For detailed syntax and semantics, please check out the concept of "Lifted Relational Neural Networks". For a deep dive into the principles in full scientific context, please see my dissertation or the book on Deep learning with relational logic representations.


<a name="myfootnote1">1</a>: Note that NeuraLogic is by no means designed or limited to learning with chemical data/models/knowledge, but we use it as an example domain here for consistency.

Use Cases

While the framework can be used to encode anything from MLPs, CNNs, RNNs, etc., it is not well suited for classic deep learning with regular data based on large homogeneous tensor operations. The framework is rather meant for efficient encoding of deep relational learning scenarios<sup>2</sup>, i.e. using dynamic (weight-sharing) neural networks to learn from data with irregular structure(s). That is why we exploit the declarative language with first-order expressiveness, as it allows for compact encoding of complex relational scenarios (similarly to how Prolog can be elegant for relational problems, while not so much for classic programming).

The framework is mostly optimized for quick, high-level prototyping of learning scenarios with sparse, irregular, relational data and complex, dynamically structured models. Particularly, you may find it beneficial for encoding various structured deep learning models.

<a name="myfootnote2">2</a>: if you come from deep learning background, you may be familiar with related terms such as "Geometric deep learning" or "Graph representation learning" (but this framework is not limited to graphs only).

Getting started

Prerequisite
Java ≥ 1.8<sup>3</sup>

Running examples

  1. download a release into some directory DIR
  2. clone this repository (or just download the Resources/datasets directory) within DIR
    • git clone https://github.com/GustikS/NeuraLogic
  3. try some trivial examples from terminal in DIR
    1. a simple familiar XOR problem
      • scalar: java -jar NeuraLogic.jar -sd ./NeuraLogic/Resources/datasets/neural/xor/naive
      • vectorized: java -jar NeuraLogic.jar -sd ./NeuraLogic/Resources/datasets/neural/xor/vectorized
    2. a simple relational problem (Example 2 from this paper)
      • java -jar NeuraLogic.jar -sd ./NeuraLogic/Resources/datasets/simple/family
    3. molecule classification problem (mutagenesis)
      • java -jar NeuraLogic.jar -sd ./NeuraLogic/Resources/datasets/relational/molecules/mutagenesis -ts 100
    4. knowledge-base completion problem (countries)
      • java -jar NeuraLogic.jar -sd ./NeuraLogic/Resources/datasets/relational/kbs/nations
  4. you can check various exported settings and results in DIR/target
    1. if you have Graphviz installed (which dot), you can observe the internal computation structures in debug mode:
      • java -jar NeuraLogic.jar -sd ./NeuraLogic/Resources/datasets/neural/xor/naive -iso -1 -prune -1 -debug all
        • this should show graphs of the 1) workflow, 2) template, 3) inference graphs, 4) neural networks + weight updates, and 5) final learned template
      • java -jar NeuraLogic.jar -sd ./NeuraLogic/Resources/datasets/simple/family -iso -1 -prune -1 -debug all
        • note that we turn off network pruning and compression here so that you can observe the direct correspondence to the original example

<a name="myfootnote3">3</a>: if you don't have Java in your system already, get it either from Oracle or OpenJDK

  • for simple usage, it is enough to get the runtime environment (JRE)