# Superlinked
## Quickstart
```
%pip install superlinked
```

```python
from superlinked import framework as sl

# Define a schema for movie reviews
class Review(sl.Schema):
    id: sl.IdField
    text: sl.String

review = Review()

# Embed review text with a sentence-transformers model
space = sl.TextSimilaritySpace(text=review.text, model="all-MiniLM-L6-v2")
index = sl.Index(space)
query = sl.Query(index).find(review).similar(space, sl.Param("search")).select_all()

# Set up an in-memory source and executor
source = sl.InMemorySource(review)
app = sl.InMemoryExecutor(sources=[source], indices=[index]).run()

# Add data and search
source.put([
    {"id": "1", "text": "Amazing acting and great story"},
    {"id": "2", "text": "Boring plot with bad acting"},
])
result = app.query(query, search="excellent performance")
print(sl.PandasConverter.to_pandas(result))
```
<details>
<summary><strong>Table of Contents</strong></summary>

- Quickstart
- Overview
- Hands-on Tutorials
- Use-cases
- Experiment in a notebook
- Run in production
- Logging
- Resources
- Support

</details>
## Overview
- WHY: Improve your vector search relevance by encoding metadata together with your unstructured data into vectors.
- WHAT: A framework and a self-hostable REST API server that connects your data, vector database and backend services.
- HOW: Construct custom data & query embedding models from pre-trained encoders from sentence-transformers, open-clip, and custom encoders for numbers, timestamps and categorical data. See the tutorials and use-case notebooks below for examples.
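To make the idea concrete, here is a toy sketch (plain NumPy, not the Superlinked API) of what "encoding metadata together with unstructured data" means: each field is embedded into its own sub-vector, and the normalized sub-vectors are weighted and concatenated into a single searchable vector. All names and the number-encoding scheme below are illustrative assumptions, not the framework's internals.

```python
import numpy as np

def embed_number(value: float, min_value: float, max_value: float) -> np.ndarray:
    # Toy scheme: map a number onto the unit quarter-circle so distances stay bounded
    angle = np.pi / 2 * (value - min_value) / (max_value - min_value)
    return np.array([np.cos(angle), np.sin(angle)])

def combine(text_vec: np.ndarray, num_vec: np.ndarray,
            text_weight: float = 1.0, num_weight: float = 1.0) -> np.ndarray:
    # Weighted concatenation: one vector that encodes both fields
    parts = [text_weight * text_vec / np.linalg.norm(text_vec),
             num_weight * num_vec / np.linalg.norm(num_vec)]
    return np.concatenate(parts)

# Hypothetical 3-dim "text embedding" standing in for a real encoder's output
text_vec = np.array([0.2, 0.5, 0.8])
vec = combine(text_vec, embed_number(4, 1, 5), text_weight=2.0)
print(vec.shape)  # (5,)
```

The weights let you decide at combination time how much the text similarity versus the numeric field should matter, which is the same lever the framework exposes at query time.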
If you like what we do, give us a star! ⭐

## Hands-on Tutorials
| Level | What you’ll build & learn | Try it now |
|-------|---------------------------|------------|
| Start here | Embed text · images · numbers · categories · time · events. | Text embedding <a href="https://colab.research.google.com/github/superlinked/superlinked/blob/main/notebook/feature/text_embedding.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Colab"></a><br>Image embedding <a href="https://colab.research.google.com/github/superlinked/superlinked/blob/main/notebook/feature/image_embedding.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Colab"></a><br>Numeric (min-max) <a href="https://colab.research.google.com/github/superlinked/superlinked/blob/main/notebook/feature/number_embedding_minmax.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Colab"></a><br>Numeric (similar) <a href="https://colab.research.google.com/github/superlinked/superlinked/blob/main/notebook/feature/number_embedding_similar.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Colab"></a><br>Categorical <a href="https://colab.research.google.com/github/superlinked/superlinked/blob/main/notebook/feature/categorical_embedding.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Colab"></a><br>Recency embedding <a href="https://colab.research.google.com/github/superlinked/superlinked/blob/main/notebook/feature/recency_embedding.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Colab"></a><br>Event effects <a href="https://colab.research.google.com/github/superlinked/superlinked/blob/main/notebook/feature/event_effects.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Colab"></a> |
| Build & extend | Combine spaces or add custom / optional schemas. | Combine embeddings <a href="https://colab.research.google.com/github/superlinked/superlinked/blob/main/notebook/feature/combine_multiple_embeddings.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Colab"></a><br>Custom space <a href="https://colab.research.google.com/github/superlinked/superlinked/blob/main/notebook/feature/custom_space.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Colab"></a><br>Optional fields <a href="https://colab.research.google.com/github/superlinked/superlinked/blob/main/notebook/feature/optional_schema_fields.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Colab"></a> |
| Optimise relevance | Real-time updates & query-time personalisation. | Dynamic parameters <a href="https://colab.research.google.com/github/superlinked/superlinked/blob/main/notebook/feature/dynamic_parameters.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Colab"></a><br>Query-time weights <a href="https://colab.research.google.com/github/superlinked/superlinked/blob/main/notebook/feature/query_time_weights.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Colab"></a><br>Query result <a href="https://colab.research.google.com/github/superlinked/superlinked/blob/main/notebook/feature/query_result.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Colab"></a> |
| Search & filter | NL search + hard filters. | NL query <a href="https://colab.research.google.com/github/superlinked/superlinked/blob/main/notebook/feature/natural_language_querying.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Colab"></a><br>Hard filtering <a href="https://colab.research.google.com/github/superlinked/superlinked/blob/main/notebook/feature/hard_filtering.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Colab"></a><br>Query options <a href="https://colab.research.google.com/github/superlinked/superlinked/blob/main/notebook/feature/querying_options.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Colab"></a><br>Vector parts <a href="https://colab.research.google.com/github/superlinked/superlinked/blob/main/notebook/feature/accessing_stored_vector_parts.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Colab"></a> |
| Analyse & export | Sample embeddings offline. | Vector sampler <a href="https://colab.research.google.com/github/superlinked/superlinked/blob/main/notebook/feature/vector_sampler.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Colab"></a> |
| Go multi-modal | Unified text + image space. | Multimodal search <a href="https://colab.research.google.com/github/superlinked/superlinked/blob/main/notebook/feature/image_embedding.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Colab"></a> |
💡 Want even more? Browse the complete list of features & concepts in [our docs](https://docs.superlinked.com/concepts/overview).
## Experiment in a notebook
Let's build an e-commerce product search that understands product descriptions and ratings.

Run the notebook example:

> First run will take a minute to download the embedding model.
```
%pip install superlinked
```

```python
import json
import os

from superlinked import framework as sl

class Product(sl.Schema):
    id: sl.IdField
    description: sl.String
    rating: sl.Integer

product = Product()

# Semantic space over product descriptions
description_space = sl.TextSimilaritySpace(
    text=product.description, model="Alibaba-NLP/gte-large-en-v1.5"
)
# Numeric space that favours higher-rated products
rating_space = sl.NumberSpace(
    number=product.rating, min_value=1, max_value=5, mode=sl.Mode.MAXIMUM
)
```
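The example breaks off above. To illustrate conceptually what a combined description-plus-rating search does, here is a toy ranking sketch in plain Python (not the Superlinked API; the vectors and scaling are illustrative assumptions): each product gets a hypothetical text sub-vector plus a rating scaled to [0, 1], and cosine similarity over the concatenation lets a high rating break ties between equally relevant descriptions.

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# Hypothetical 2-dim "text embeddings" with the rating appended as a third component
products = {
    "good_fit_high_rating": [0.9, 0.1, 5 / 5],
    "good_fit_low_rating":  [0.9, 0.1, 1 / 5],
    "poor_fit_high_rating": [0.1, 0.9, 5 / 5],
}
query_vec = [0.9, 0.1, 1.0]  # text intent plus "prefer high rating"

ranked = sorted(products, key=lambda p: cosine(products[p], query_vec), reverse=True)
print(ranked[0])  # good_fit_high_rating ranks first
```

A relevant description with a high rating beats the same description with a low rating, which in turn beats an irrelevant description: the single combined vector carries both signals, which is the behaviour the two spaces above are built to produce at scale.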