Cachier
A library of GraphQL caching solutions that makes developing and deploying with GraphQL easier and faster: three npm packages under the @cachier scope providing client- and server-side custom eviction policies, cache normalization, and partial query retrieval. A demo app to visualize the efficiency of our solutions has launched!
<a name="readme-top"></a>
[![Contributors][contributors-shield]][contributors-url] [![Forks][forks-shield]][forks-url] [![Stargazers][stars-shield]][stars-url] [![Issues][issues-shield]][issues-url] [![MIT License][license-shield]][license-url] [![LinkedIn][linkedin-shield]][linkedin-url]
<!-- PROJECT LOGO -->
<br />
<div align="center">
  <a href="https://github.com/oslabs-beta/Cachier">
    <img src="demo/client/styles/cachierlogo.png" alt="Cachier Logo" title="Cachier Logo" width="520" height="180">
  </a>
  <h3 align="center">Cachier</h3>
  <p align="center">
    GraphQL caching tool with custom eviction policies, cache normalization.
    <br />
    <a href="https://cachierql.com/"><strong>Explore the docs »</strong></a>
    <br />
    <br />
    <a href="https://github.com/oslabs-beta/Cachier">View Demo</a>
    ·
    <a href="https://github.com/oslabs-beta/Cachier/issues">Report Bug</a>
    ·
    <a href="https://github.com/oslabs-beta/Cachier/issues">Request Feature</a>
  </p>
</div>

<!-- TABLE OF CONTENTS -->
<details>
  <summary>Table of Contents</summary>
  <ol>
    <li>
      <a href="#about-the-project">About The Project</a>
      <ul>
        <li><a href="#built-with">Built With</a></li>
      </ul>
    </li>
    <li>
      <a href="#getting-started">Getting Started</a>
      <ul>
        <li><a href="#if-using-redis">Prerequisites</a></li>
        <li><a href="#installation-and-import">Installation</a></li>
      </ul>
    </li>
    <li><a href="#usage">Usage</a></li>
    <li><a href="#roadmap">Roadmap</a></li>
    <li><a href="#contributing">Contributing</a></li>
    <li><a href="#license">License</a></li>
    <li><a href="#contributors">Contact</a></li>
    <li><a href="#works-cited">Acknowledgments</a></li>
  </ol>
</details>

<!-- ABOUT THE PROJECT -->

## About The Project
### Built With
[![React][React.js]][React-url] [![Redis][Redis.io]][Redis-url] [![GraphQL][GraphQL.io]][GraphQL-url] [![Node/Express][Express.io]][Express-url] [![TailwindCSS][TailwindCSS.io]][Tailwind-url]
<p align="right">(<a href="#readme-top">back to top</a>)</p>

Welcome to Cachier, a lightweight caching tool built specifically for GraphQL to reduce load times and minimize data fetching.
GraphQL lacks native HTTP caching because it relies solely on the POST method, which risks over-fetching by re-running queries and slows load times. Our team of engineers developed a compact, easy-to-use solution that lets users cache their queries on both the server side and the client side!
Cachier currently offers:
- Client-side caching backed by session storage
- The ability to choose between Redis and a native in-memory cache
- Unique key generation for response data, so developers don't have to manually tag entries for the cache
- Partial and exact matching for query fields in the developer's GraphQL API
- Highly configurable eviction policies
We created a highly performant and customizable GraphQL caching library that consists of three main caching functions:
- Cachier Normalized Server-side Cache
- Cachier Direct Server-side Cache
- Cachier Direct Client-side Cache
We will go over each solution in detail below:
<!-- GETTING STARTED -->

## Getting Started
### Cachier Normalized Server-side Cache
Cachier's Normalized Server-side Cache breaks up GraphQL queries into individual sub-queries to be stored in the cache. This provides maximum cache efficiency by organizing data in a way that prevents data redundancy and allows for partial retrieval of subset data, drastically reducing network requests to the database.
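The idea of partial retrieval can be illustrated with a minimal sketch (a hypothetical helper, not Cachier's API): answer whatever sub-queries are already cached and forward only the misses.

```javascript
// Hypothetical helper (not Cachier's API) showing the idea of partial
// retrieval: each sub-query key is looked up independently, and only the
// misses need to be forwarded to the GraphQL API.
function splitByCache(cache, cacheKeys) {
  const hits = {};
  const misses = [];
  for (const key of cacheKeys) {
    if (key in cache) {
      hits[key] = cache[key]; // resolved straight from the cache
    } else {
      misses.push(key); // must still be fetched from the API
    }
  }
  return { hits, misses };
}

// 'dragons' is cached, 'rockets' is not, so only 'rockets' would
// trigger a network request.
const cache = { dragons: [{ id: 'dragon1' }] };
const { hits, misses } = splitByCache(cache, ['dragons', 'rockets']);
```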
#### Installation and Import
If this is your first time using Cachier's Normalized Cache, run the following command in your terminal:

```shell
npm install @cachier/cache-partials
```
In your server file, require our middleware to handle GraphQL requests using the CommonJS format:

```js
const Cachier = require('@cachier/cache-partials');
```
Set up your Cachier middleware function. Its parameters are:

- `endpoint`: the endpoint that the client will make GraphQL queries to when it wants to utilize the cache.
- `graphQLEndpoint`: where you specify your GraphQL API's endpoint. This allows Cachier to route any queries that cannot be resolved by the Cachier cache to your GraphQL API.
- `cacheCapacity`: the maximum cache length, which lets Cachier know when to evict from the cache. All capacity inputs should be multiples of 100. The default capacity is 100 (1000 keys in the cache).
- `sampleSize`: the number of random keys that will be considered for eviction. The default sampleSize is 5, which we recommend for most applications.
- `evictionSize`: the number of evictions that will be made when your cache capacity is reached. The default evictionSize is 5.

```js
app.use(
  endpoint,
  Cachier(graphQLEndpoint, cacheCapacity, sampleSize, evictionSize)
);
```
<p align="right">(<a href="#readme-top">back to top</a>)</p>
<!-- USAGE EXAMPLES -->
## Usage
```js
app.use('/Cachier', Cachier('https://api.spacex.land/graphql', 100, 5, 5));
```
To fetch from Cachier's normalized cache, make a request just as you would to your GraphQL API, except you will need to set the uniques option in the request body. The uniques object must contain a unique identifier for every list in your query, with the list name as the key and the unique identifier as the value. The unique identifier is any queried piece of data that is unique to each list item!
```js
fetch('/graphql', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Accept: 'application/json',
  },
  body: JSON.stringify({
    query: queryGraphQLString,
    uniques: { listKey: uniqueIdentifier },
  }),
});
```
### How it works
Example Fetch to SpaceX GQL API:
```js
fetch('http://localhost:3000/partialCache', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Accept: 'application/json',
  },
  body: JSON.stringify({
    query: `{
      dragons {
        id
        return_payload_mass {
          kg
        }
      }
    }`,
    uniques: { dragons: 'id' },
  }),
});
```
The client fetches to the Cachier cache endpoint with an object containing the query string and the unique types. The unique types need to contain a unique identifier for all array/list items so that Cachier can generate a unique cache key.
```json
{
  "typesArr": ["dragons"],
  "fieldsArr": [
    [
      "__typename",
      "id",
      {
        "return_payload_mass": ["__typename", "kg"]
      }
    ]
  ]
}
```
Cachier parses incoming GraphQL queries and separates them into subqueries stored in a "Cachier" object. The queries are broken up into two arrays, typesArr and fieldsArr, whose corresponding indexes connect with one another. fieldsArr is an array of arrays containing the fields for each cache key; if a field is nested, it is stored as a nested object. Cachier then waits for the returned data and uses this "Cachier" query object to sort the data into the cache.
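As a rough sketch of this parsing step (illustrative only; Cachier's actual parser handles far more of the GraphQL grammar, and `parseQuery` is a hypothetical name):

```javascript
// Simplified sketch of the parsing step (hypothetical code; a real parser
// also handles arguments, aliases, and fragments). It splits a query into
// typesArr and fieldsArr with matching indexes.
function parseQuery(query) {
  // Tokenize the query into field names and braces.
  const tokens = query.match(/[A-Za-z_][A-Za-z0-9_]*|[{}]/g);
  let i = 1; // skip the query's opening '{'

  // Parse one selection set; nested selections become { name: [...] }
  // objects, mirroring the fieldsArr shape shown above.
  function parseFields() {
    const fields = ['__typename'];
    while (tokens[i] !== '}') {
      const name = tokens[i++];
      if (tokens[i] === '{') {
        i++; // consume the nested '{'
        fields.push({ [name]: parseFields() });
      } else {
        fields.push(name);
      }
    }
    i++; // consume the closing '}'
    return fields;
  }

  const typesArr = [];
  const fieldsArr = [];
  while (i < tokens.length && tokens[i] !== '}') {
    typesArr.push(tokens[i++]); // top-level type, e.g. 'dragons'
    i++; // consume its '{'
    fieldsArr.push(parseFields());
  }
  return { typesArr, fieldsArr };
}

const result = parseQuery('{ dragons { id return_payload_mass { kg } } }');
```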
Here is the data returned from our example query:
```json
{
  "data": {
    "dragons": [
      {
        "id": "dragon2",
        "return_payload_mass": {
          "kg": 3000
        }
      },
      {
        "id": "dragon1",
        "return_payload_mass": {
          "kg": 3000
        }
      }
    ]
  }
}
```
After receiving the data back, Cachier utilizes the query map stored in the "Cachier" object to normalize and store the data as individual keys inside the cache. This is how the data looks once normalized and stored in the cache:
```json
{
  "dragons": [
    "dragon:dragon2",
    "dragon:dragon1",
    8492.694458007812
  ],
  "dragon:dragon2": {
    "__typename": "Dragon",
    "id": "dragon2",
    "return_payload_mass": {
      "__typename": "Mass",
      "kg": 3000
    },
    "__CachierCacheDate": 8492.681999921799
  },
  "dragon:dragon1": {
    "__typename": "Dragon",
    "id": "dragon1",
    "return_payload_mass": {
      "__typename": "Mass",
      "kg": 3000
    },
    "__CachierCacheDate": 8492.691667079926
  }
}
```
As you can see, the dragons array now only stores references to keys in the cache, and the data from the array is stored as separate, unique keys in the cache. This normalized cache structure eliminates data redundancy in the cache and allows for partial retrieval of subset data. (The "__CachierCacheDate" fields and the number at the last index of the array keep track of recency for our eviction policy, which we discuss next.)
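The normalization step can be sketched roughly as follows (a simplified illustration, not Cachier's source; the singularization and timestamping details here are assumptions):

```javascript
// Simplified illustration of normalization (not Cachier's source code).
// Each list item becomes its own cache entry, keyed by the singular list
// name plus the item's unique identifier; the list itself stores only
// references to those keys plus a recency marker.
function normalize(cache, data, uniques) {
  for (const [listName, items] of Object.entries(data)) {
    const idField = uniques[listName]; // e.g. { dragons: 'id' }
    const refs = [];
    for (const item of items) {
      // Naive singularization for the sketch: 'dragons' -> 'dragon:dragon2'
      const key = `${listName.replace(/s$/, '')}:${item[idField]}`;
      cache[key] = { ...item, __CachierCacheDate: Date.now() };
      refs.push(key);
    }
    refs.push(Date.now()); // recency marker for the list as a whole
    cache[listName] = refs;
  }
  return cache;
}

const cache = normalize(
  {},
  { dragons: [{ id: 'dragon2', return_payload_mass: { kg: 3000 } }] },
  { dragons: 'id' }
);
```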
### Approximated LRU Eviction
Cachier's Normalized Cache uses a custom Approximated LRU Eviction Policy. This is not a true LRU implementation, but it comes very close in terms of performance. Cachier avoids a true LRU implementation because it would cost more memory. Cachier's LRU policy works by drawing a sample of randomly selected keys from the cache (the sample size can be configured by the developer) and evicting the least recently used key from that sample.
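In JavaScript, the sampling approach looks roughly like this (an illustrative sketch, not Cachier's source; it assumes every entry carries a `__CachierCacheDate` recency stamp as shown above):

```javascript
// Illustrative sketch of approximated LRU eviction (not Cachier's source).
// Assumes every cache entry carries a __CachierCacheDate recency stamp.
function approximatedLRUEvict(cache, sampleSize, evictionSize) {
  for (let n = 0; n < evictionSize; n++) {
    const keys = Object.keys(cache);
    if (keys.length === 0) break;
    // Draw a random sample of keys (with replacement, for simplicity).
    const sample = [];
    for (let s = 0; s < Math.min(sampleSize, keys.length); s++) {
      sample.push(keys[Math.floor(Math.random() * keys.length)]);
    }
    // Evict the least recently used key within the sample.
    const victim = sample.reduce((a, b) =>
      cache[a].__CachierCacheDate <= cache[b].__CachierCacheDate ? a : b
    );
    delete cache[victim];
  }
}

// With three entries and an evictionSize of 2, one entry survives.
const demoCache = {
  'dragon:a': { __CachierCacheDate: 1 },
  'dragon:b': { __CachierCacheDate: 2 },
  'dragon:c': { __CachierCacheDate: 3 },
};
approximatedLRUEvict(demoCache, 5, 2);
```

This trades a small amount of accuracy for memory: no doubly linked list of all keys is maintained, only timestamps, which is the same trade-off Redis makes with its approximated LRU policy.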
<p align="right">(<a href="#readme-top">back to top</a>)</p>

### Cachier Direct Server-side Cache
Cachier's Direct Server-side Cache uses a custom LRU-SLFR (Least Recently Used Smallest Latency First Replacement) policy. LRU-SLFR is very similar to LRU except it takes latency into account as well as recency when evicting. Cachier's LRU-SLFR eviction policy utilizes a linked hash map to ach
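A linked hash map gives O(1) lookups while preserving access order, which is what makes this eviction scheme cheap. One plausible sketch of an LRU-SLFR-style cache (illustrative only; the class name, the `window` parameter, and the smallest-latency tie-breaking shown here are assumptions, not Cachier's actual implementation):

```javascript
// Hypothetical sketch of an LRU-SLFR-style cache (not Cachier's source).
// A JS Map keeps insertion order, acting as a linked hash map: reading a
// key re-inserts it at the tail, so the head is always least recently used.
class LRUSLFRCache {
  constructor(capacity, window = 3) {
    this.capacity = capacity;
    this.window = window; // how many LRU entries to compare by latency
    this.map = new Map(); // key -> { value, latency }
  }

  get(key) {
    const entry = this.map.get(key);
    if (!entry) return undefined;
    this.map.delete(key); // move to most-recently-used position
    this.map.set(key, entry);
    return entry.value;
  }

  set(key, value, latency) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, { value, latency });
    if (this.map.size > this.capacity) this.evict();
  }

  evict() {
    // Among the `window` least recently used entries, evict the one with
    // the smallest recorded fetch latency (cheapest to re-fetch).
    const candidates = [...this.map.entries()].slice(0, this.window);
    const [victim] = candidates.reduce((a, b) =>
      a[1].latency <= b[1].latency ? a : b
    );
    this.map.delete(victim);
  }
}

// 'fast' (10ms) is evicted before 'slow' (500ms), even though 'slow'
// is older, because 'fast' is cheaper to re-fetch.
const demo = new LRUSLFRCache(2, 2);
demo.set('slow', 'a', 500);
demo.set('fast', 'b', 10);
demo.set('new', 'c', 100);
```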
