Representing partial knowledge (on minimum distance)

Issue #75 resolved
Johan Rosenkilde created an issue

In particular w.r.t. the minimum distance, it is important that we can represent partial information: lower and upper bounds. For instance, Concatenated Codes have a nice a priori bound on the minimum distance, but it is not always tight. Similar things hold for many other classes.

I think that the LinearCode.minimum_distance() function should retain its semantics, i.e. that it returns the true minimum distance (possibly costing exponential work). Of course, for a GRS code, where n-k+1 is the true minimum distance, this function can be overridden.
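
To illustrate the intended semantics, here is a minimal sketch with hypothetical toy classes over GF(2) (not the actual Sage classes): the base class does the expensive exact computation, and a GRS/MDS-like subclass overrides it with the closed form.

    from itertools import product

    class ToyLinearCode:
        # Hypothetical stand-in for LinearCode, binary only, for illustration.
        def __init__(self, generator_rows):
            self.G = generator_rows
            self.n = len(generator_rows[0])
            self.k = len(generator_rows)

        def minimum_distance(self):
            # Proposed semantics: return the *true* minimum distance,
            # possibly at exponential cost (enumerate all nonzero messages).
            best = self.n
            for msg in product((0, 1), repeat=self.k):
                if not any(msg):
                    continue
                word = [sum(m * g for m, g in zip(msg, col)) % 2
                        for col in zip(*self.G)]
                best = min(best, sum(word))
            return best

    class ToyGRSLikeCode(ToyLinearCode):
        def minimum_distance(self):
            # An MDS code such as GRS can override this: d = n - k + 1 exactly.
            return self.n - self.k + 1

    # Usage:
    C = ToyLinearCode([[1, 0, 1, 1], [0, 1, 1, 0]])
    C.minimum_distance()   # 2, found by brute force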

To represent bounds, one possibility is minimum_distance_bound(), a very code-class-specific function which returns the best known cheap bound for that code class (a sketch follows the list below). For instance,

  • ConcatenatedCode returns d*D
  • Goppa returns deg(G) + 1, except if it's a binary Goppa code, then 2*deg(G)+1. That is, at least until we get around to implementing the Sugiyama-Kasahara-Hirasawa-Namekawa bound.
  • CyclicCode returns the BCH bound, or the Hartmann-Tzeng bound once we have implemented it.
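
As a concrete sketch of such a class-specific method, the concatenated case could look roughly like this (hypothetical names, not existing Sage code; the inner and outer codes are assumed to expose the same method, returning d and D respectively):

    class ConcatenatedCode:
        def __init__(self, outer_code, inner_code):
            self.outer = outer_code
            self.inner = inner_code

        def minimum_distance_bound(self):
            # Cheap a priori lower bound: the product d*D of the outer and
            # inner minimum distances (or bounds on them, if that is all we have).
            return self.outer.minimum_distance_bound() * self.inner.minimum_distance_bound()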

One could imagine bounds which are quite expensive, though not exponential. For instance, I believe one of the cyclic code bounds runs in O(n^4), which a typical user might not want to wait for if they just wanted a quick, OK bound. In that case, an optional argument could let the user choose which method to compute (yet another registration system?).
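
For instance, a keyword argument backed by a small per-class registry could look roughly like this (a hypothetical sketch; bch_bound() and hartmann_tzeng_bound() are assumed helpers, not existing methods):

    class CyclicCode:
        # Registry mapping algorithm names to bound-computing callables.
        _distance_bounds = {}

        @classmethod
        def register_distance_bound(cls, name, func):
            cls._distance_bounds[name] = func

        def minimum_distance_bound(self, algorithm="BCH"):
            # Cheap BCH bound by default; more expensive bounds
            # (e.g. an O(n^4) one) only when explicitly requested.
            if algorithm not in self._distance_bounds:
                raise ValueError("unknown bound algorithm: %r" % algorithm)
            return self._distance_bounds[algorithm](self)

    # Registration done once, e.g. at module load time.
    CyclicCode.register_distance_bound("BCH", lambda C: C.bch_bound())
    CyclicCode.register_distance_bound("Hartmann-Tzeng", lambda C: C.hartmann_tzeng_bound())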

Do we also need something like this for upper bounds? Am I totally overengineering this thing?

Comments (2)

  1. Daniel Augot

    Shouldn't there be two functions: LinearCode.minimum_distance_lower_bound() and LinearCode.minimum_distance_upper_bound()?

    And they could provide an "argument", with something like

    d_star, arg = LinearCode.minimum_distance_lower_bound()
    d_star
    9
    arg
    "Zeh's bound, auxiliary codes being the cyclic code blabla1 and the cyclic code blabla2"
    