rdf.sh

A multi-tool shell script for doing Semantic Web jobs on the command line.


contents

  • installation
  • dependencies
  • files
  • usage / features

<a name="installation"></a>

installation

manually

rdf.sh is a single bash shell script so installation is trivial ... :-) Just copy or link it somewhere into your path, e.g. with

sudo ln -s /path/to/rdf.sh /usr/local/bin/rdf

debian / ubuntu

You can download a debian package from the release section and install it as root with the following commands:

sudo dpkg -i /path/to/your/rdf.sh_X.Y_all.deb
sudo apt-get -f install

The dpkg run will probably fail due to missing dependencies, but the apt-get run will install all dependencies as well as rdf.

Currently, zsh is a hard dependency since the zsh completion "needs" it.

brew based

You can install rdf.sh by using the provided recipe:

brew install https://raw.githubusercontent.com/seebi/rdf.sh/develop/brew/rdf.sh.rb

This will install the latest stable version. In case you want to install the latest develop version, use this command:

brew install --HEAD https://raw.githubusercontent.com/seebi/rdf.sh/develop/brew/rdf.sh.rb

docker based

You can install rdf.sh by using the provided docker image:

docker pull seebi/rdf.sh

After that, you can e.g. run this command:

docker run -i -t --rm seebi/rdf.sh desc foaf:Person

<a name="dependencies"></a>

dependencies

Required tools currently are:

  • roqet (from rasqal-utils)
  • rapper (from raptor-utils or raptor2-utils)
  • curl

Suggested tools are:

  • zsh (without the autocompletion, it is not the same)

<a name="files"></a>

files

These files are available in the repository:

  • README.md - this file
  • _rdf - zsh autocompletion file
  • CHANGELOG.md - version change log
  • doap.ttl - doap description of rdf.sh
  • rdf.1 - rdf.sh man page
  • rdf.sh - the script
  • Screenshot.png - a screeny of rdf.sh in action
  • example.rc - an example config file which can be copied

These files are used by rdf.sh:

  • $HOME/.cache/rdf.sh/resource.history - history of all processed resources
  • $HOME/.cache/rdf.sh/prefix.cache - a cache of all fetched namespaces
  • $HOME/.config/rdf.sh/prefix.local - locally defined prefix / namespaces
  • $HOME/.config/rdf.sh/rc - config file

rdf.sh follows the XDG Base Directory Specification in order to allow different cache and config directories.
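Since rdf.sh derives its paths from the XDG base directory variables, you can relocate its cache and config state by exporting them before invoking rdf. The directory choices below are examples:

```shell
# Relocate rdf.sh state via the XDG base directory variables
export XDG_CACHE_HOME=/tmp/rdfsh-demo/cache    # resource.history, prefix.cache land here
export XDG_CONFIG_HOME=/tmp/rdfsh-demo/config  # rc, prefix.local are read from here
mkdir -p "$XDG_CACHE_HOME/rdf.sh" "$XDG_CONFIG_HOME/rdf.sh"
```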

<a name="usage-features"></a>

usage / features

<a name="overview"></a>

overview

rdf.sh currently provides these subcommands:

  • color: get a html color for a resource URI
  • count: count distinct triples
  • delete: deletes an existing linked data resource via LDP
  • desc: outputs description of the given resource in a given format (default: turtle)
  • diff: diff of triples from two RDF files
  • edit: edit the content of an existing linked data resource via LDP (GET + PUT)
  • get: fetches a URL as RDF to stdout (tries accept header)
  • get-ntriples: curls rdf and transforms to ntriples
  • gsp-delete: delete a graph via SPARQL 1.1 Graph Store HTTP Protocol
  • gsp-get: get a graph via SPARQL 1.1 Graph Store HTTP Protocol
  • gsp-put: delete and re-create a graph via SPARQL 1.1 Graph Store HTTP Protocol
  • head: curls only the http header but accepts only rdf
  • headn: curls only the http header
  • help: outputs the manpage of rdf
  • list: list resources which start with the given URI
  • ns: curls the namespace from prefix.cc
  • nscollect: collects prefix declarations of a list of ttl/n3 files
  • nsdist: distributes prefix declarations from one file to a list of other ttl/n3 files
  • nssort: sorts the prefix declarations in files
  • put: replaces an existing linked data resource via LDP
  • split: split an RDF file into pieces of at most X triples and output the file names
  • turtleize: outputs an RDF file in turtle, using as many prefix declarations as possible
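A few of these subcommands in action (a sketch assuming rdf.sh is on your PATH as `rdf`; the exact output depends on the vocabulary server):

```shell
rdf ns foaf            # prefix -> namespace lookup (cached)
rdf desc foaf:Agent    # describe a resource, turtle by default
rdf head foaf:Agent    # only the HTTP headers of the RDF request
```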

<a name="nslookup"></a>

namespace lookup (ns)

rdf.sh allows you to quickly lookup namespaces from prefix.cc as well as locally defined prefixes:

$ rdf ns foaf
http://xmlns.com/foaf/0.1/

These namespace lookups are cached (typically in $HOME/.cache/rdf.sh/prefix.cache) in order to avoid unneeded network traffic. As a result of this subcommand, all other rdf commands can take qnames as parameters (e.g. foaf:Person or skos:Concept).

To define your own lookup table, just add a line

prefix|namespace

to $HOME/.config/rdf.sh/prefix.local. rdf.sh will use it as a priority lookup table which overrides the cache and the prefix.cc lookup.
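For example, a local prefix can be added with a single append (the `ex` prefix and namespace here are made up):

```shell
# One prefix|namespace pair per line in prefix.local
mkdir -p "$HOME/.config/rdf.sh"
echo 'ex|http://example.org/ns#' >> "$HOME/.config/rdf.sh/prefix.local"
```

After that, `rdf ns ex` resolves locally, without asking prefix.cc.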

rdf.sh can also output prefix.cc syntax templates (uncached):

$ rdf ns skos sparql
PREFIX skos: <http://www.w3.org/2004/02/skos/core#>

SELECT *
WHERE {
  ?s ?p ?o .
}

$ rdf ns dct n3
@prefix dct: <http://purl.org/dc/terms/>.

<a name="description"></a>

resource description (desc)

Describe a resource by querying for statements where the resource is the subject. This is extremely useful for quickly checking schema details.

$ rdf desc foaf:Person
@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix owl: <http://www.w3.org/2002/07/owl#> .
@prefix foaf: <http://xmlns.com/foaf/0.1/> .
@prefix geo: <http://www.w3.org/2003/01/geo/wgs84_pos#> .
@prefix contact: <http://www.w3.org/2000/10/swap/pim/contact#> .

foaf:Person
    a rdfs:Class, owl:Class ;
    rdfs:comment "A person." ;
    rdfs:isDefinedBy <http://xmlns.com/foaf/0.1/> ;
    rdfs:label "Person" ;
    rdfs:subClassOf contact:Person, geo:SpatialThing, foaf:Agent ;
    owl:disjointWith foaf:Organization, foaf:Project ;
    <http://www.w3.org/2003/06/sw-vocab-status/ns#term_status> "stable" .

In addition to the textual representation, you can calculate a color for visual resource representation with the color command:

$ rdf color http://sebastian.tramp.name
#2024e9

Refer to the cold webpage (archived) for more information. :-)

<a name="gsp"></a>

SPARQL graph store protocol client

The SPARQL 1.1 Graph Store HTTP Protocol describes the use of HTTP operations for the purpose of managing a collection of RDF graphs. rdf.sh supports the following commands in order to manipulate graphs:

Syntax: rdf gsp-get <graph URI | Prefix:LocalPart> <store URL | Prefix:LocalPart (optional)>
(get a graph via SPARQL 1.1 Graph Store HTTP Protocol)
Syntax: rdf gsp-put <graph URI | Prefix:LocalPart> <path/to/your/file.rdf> <store URL | Prefix:LocalPart (optional)>
(delete and re-create a graph via SPARQL 1.1 Graph Store HTTP Protocol)
Syntax: rdf gsp-delete <graph URI | Prefix:LocalPart> <store URL | Prefix:LocalPart (optional)>
(delete a graph via SPARQL 1.1 Graph Store HTTP Protocol)

If the store URL is not given, Direct Graph Identification is assumed, which means the graph URI itself is used as the request URL. If the store URL is given, Indirect Graph Identification is used.
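A sketch of both modes (graph URI, file name, and store endpoint are examples; point them at your own triple store):

```shell
# Indirect identification: store endpoint given explicitly as last argument
rdf gsp-put http://example.org/mygraph data.ttl http://localhost:8890/sparql-graph-crud

# Direct identification: the graph URI itself is dereferenced
rdf gsp-get http://example.org/mygraph
```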

<a name="ldp"></a>

linked data platform client

The Linked Data Platform describes a read-write Linked Data architecture, based on HTTP access to web resources that describe their state using the RDF data model. rdf.sh supports DELETE, PUT and edit (GET, followed by an edit command, followed by a PUT request) of Linked Data Platform Resources (LDPRs).

Syntax: rdf put <URI | Prefix:LocalPart> <path/to/your/file.rdf>
(replaces an existing linked data resource via LDP)
Syntax: rdf delete <URI | Prefix:LocalPart>
(deletes an existing linked data resource via LDP)
Syntax: rdf edit <URI | Prefix:LocalPart>
(edit the content of an existing linked data resource via LDP (GET + PUT))

The edit command uses the EDITOR environment variable to start the editor of your choice with a prepared turtle file. You can change the content of that file (add or remove triples) and you can use any prefix you have already declared via config or which is cached. Used prefix declarations are added automatically afterwards and the file is then PUT back to the server.
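Since EDITOR is an ordinary environment variable, it can be overridden for a single edit session (the resource URI is an example):

```shell
# One-off editor choice for this edit only
EDITOR=vim rdf edit http://example.org/resource/alice
```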

<a name="webid"></a>

WebID requests

In order to request resources with your WebID client certificate, you need to set up the rdf.sh rc file (see the configuration section). curl allows using client certificates with the -E parameter, which needs a pem file containing your private key AND the certificate.
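Such a combined pem file is just the key and the certificate concatenated. The stand-in files below only illustrate the shape; replace them with your real WebID key and certificate:

```shell
# Placeholder key and cert (use your real files instead)
printf '%s\n' '-----BEGIN PRIVATE KEY-----' '...' '-----END PRIVATE KEY-----' > webid.key
printf '%s\n' '-----BEGIN CERTIFICATE-----' '...' '-----END CERTIFICATE-----' > webid.crt

# Combine key + certificate into the single pem file curl -E expects
cat webid.key webid.crt > webid.pem
chmod 600 webid.pem   # the pem contains your private key
```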

To use your properly created WebID pem file, just add
