Jim Downing
isitlinked

Overview

This code was hacked out as an entry for the JISC Dev8D bounty on government linked data. 

It aims to provide a simple dashboard showing how well the triples exposed by a SPARQL endpoint comply with the rules of Linked Data Club: -

   * Use HTTP URIs. 
   * Respond to those URIs with useful information.
   * Link to other data.
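
As a rough illustration, rules 1 and 2 can be probed for a single URI
with an HTTP HEAD request plus content negotiation. The sketch below
is not the code in this repo: it assumes Python with the requests
library, a hypothetical looks_linked helper, and an example DBpedia
URI.

    import requests

    # Media types that suggest a URI dereferences to semantic data (rule 2).
    RDF_TYPES = ("application/rdf+xml", "text/turtle",
                 "text/n3", "application/ld+json")

    def looks_linked(uri):
        """Rough check: is this an HTTP URI (rule 1) that dereferences
        to an RDF serialization when asked via conneg (rule 2)?"""
        if not uri.startswith(("http://", "https://")):
            return False
        try:
            resp = requests.head(uri,
                                 headers={"Accept": ", ".join(RDF_TYPES)},
                                 allow_redirects=True,  # follow 303s etc.
                                 timeout=10)
        except requests.RequestException:
            return False
        content_type = resp.headers.get("Content-Type", "").split(";")[0]
        return resp.ok and content_type.strip() in RDF_TYPES

    print(looks_linked("http://dbpedia.org/resource/London"))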

It's easy to just encode a load of RDF, bung it in a triplestore and
expose a SPARQL endpoint to it - but that's just not quite as useful
as LD. Linked data makes the semantics much easier and more tractable
to use; semantic URIs are miles more useful than URIs that don't
dereference. I also find LD much easier than (e.g.) SPARQL for
getting a feel for what schemas have been used, and the extent of
the data.

The code hoovers up URIs from an endpoint and analyses each one by
sending HEAD requests, using content negotiation to look for
semantic data.
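
A minimal sketch of the hoovering step, again not the actual code
here: it assumes an endpoint speaking the standard SPARQL protocol
over HTTP GET and returning JSON results, and the query is purely
illustrative.

    import requests

    def hoover_uris(endpoint, limit=100):
        """Pull a sample of distinct URIs (subjects and objects)
        out of a SPARQL endpoint."""
        query = """
            SELECT DISTINCT ?uri WHERE {
                { ?uri ?p ?o } UNION { ?s ?p ?uri FILTER isIRI(?uri) }
            } LIMIT %d
        """ % limit
        resp = requests.get(endpoint,
                            params={"query": query},
                            headers={"Accept":
                                     "application/sparql-results+json"},
                            timeout=30)
        resp.raise_for_status()
        bindings = resp.json()["results"]["bindings"]
        return [b["uri"]["value"] for b in bindings
                if b["uri"]["type"] == "uri"]

Each harvested URI can then be run through a check like the
looks_linked sketch above, and the pass/fail tallies are what end up
in the table and pie chart.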

At the moment, the software simply displays the data as a table and a pie chart. Extensions that would be fun: -

   * Expose this data as LD, as a service
   * Periodic analyses to allow trends to be displayed
   * Split LD and non-LD URIs by domain (outgoing links / internal
    links). 
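
The last of those mostly comes down to comparing each URI's host with
the dataset's own domain. A rough sketch, where home_domain is an
assumed parameter rather than anything the code currently knows:

    from urllib.parse import urlparse

    def split_by_domain(uris, home_domain):
        """Partition URIs into internal (same domain as the dataset)
        and outgoing (everything else)."""
        internal, outgoing = [], []
        for uri in uris:
            host = urlparse(uri).netloc.lower()
            (internal if host.endswith(home_domain)
             else outgoing).append(uri)
        return internal, outgoing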

THE NUMBERS IN THE CURRENT OUTPUT SHOULD NOT BE TRUSTED TOO MUCH!