Metrics and Channels
Metrics
The most important metrics are Hamming and Erasure, but it would be nice if other metrics were not too difficult to work with within our framework.
 Soft-error version of the Hamming metric

Rank metric
Sven Puchinger, a PhD student in Ulm, is implementing a (small) Gabidulin code library in Sage (still in its infancy). He is currently modelling his code after Codinglib. Afterwards, he might be motivated to port it to our framework if it is not too much hassle.
There is a very large ticket implementing skew polynomials. Getting this ticket reviewed would be a relatively easy way to add a lot of functionality.

Lee metric
Channels
We already have the following channels:
 Static Error Rate (add t errors to a block)
 Error and Erasure (add t errors and e erasures to a block)
The above channels inherently work on blocks of symbols. Other such channels include
 Phased burst error channel (error bursts always begin/end at block boundaries)
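As a sketch of how such a block channel could look outside Sage (plain Python; symbols are modelled as integers in range(q) rather than GF(q) elements, and the class name merely mirrors the channel above):

```python
import random

class StaticErrorRate:
    """Sketch of a block channel: adds exactly t errors to a block of n
    symbols. Symbols are integers in range(q); a real implementation
    would work on vectors over GF(q) instead."""

    def __init__(self, q, n, t):
        self.q, self.n, self.t = q, n, t

    def transmit(self, block):
        assert len(block) == self.n
        received = list(block)
        # pick t distinct positions; replace each symbol by a different
        # one, so exactly t errors are added
        for i in random.sample(range(self.n), self.t):
            received[i] = random.choice(
                [s for s in range(self.q) if s != block[i]])
        return received

word = [0] * 10
out = StaticErrorRate(q=7, n=10, t=3).transmit(word)
```

Since every chosen position is replaced by a *different* symbol, the received word differs from the sent word in exactly t positions.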
Other channels work more naturally on single symbols or streams thereof. Examples are:
 Memoryless q-ary symmetric channel
 Burst error channel (non-phased, i.e. it has a state: "in a burst" or not)
 Additive White Gaussian Noise (operates on single real or complex numbers)
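For illustration, a memoryless q-ary symmetric channel acting on a single symbol could be sketched as follows (plain Python; symbols are integers in range(q), and p is the per-symbol error probability — an assumption of this sketch, not an existing API):

```python
import random

def qary_symmetric(symbol, q, p):
    """Memoryless q-ary symmetric channel on one symbol in range(q):
    with probability p the symbol is replaced by one of the q - 1
    other symbols, each equally likely."""
    if random.random() < p:
        return random.choice([s for s in range(q) if s != symbol])
    return symbol

# acting on a stream is just acting on each symbol independently
stream = [0, 1, 2, 3] * 5
received = [qary_symmetric(s, q=4, p=0.1) for s in stream]
```

Memorylessness is exactly what makes the stream version a simple element-wise map.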
Sometimes channels should convert between block and symbol representations:
 Modulation (from GF(q) to complex numbers)
 Interleaving (take s blocks and interleave their symbols into one big block)
 Bundling (bundle b successive symbols into one new symbol of a larger field; not sure if the name is good)
 Blocking (make a block of n successive symbols; doesn't change the field like Bundling does)
 Inversions of the above
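A minimal sketch of Bundling and its inverse, with extension-field elements represented as integers below q**b via base-q digits (an assumption for this sketch; in Sage one would instead map GF(q)^b to GF(q^b) through a vector-space basis):

```python
def bundle(symbols, q, b):
    """Bundle b successive base-field symbols (integers mod q) into one
    extension-field symbol, represented as an integer below q**b."""
    assert len(symbols) % b == 0
    out = []
    for i in range(0, len(symbols), b):
        val = 0
        for s in symbols[i:i + b]:
            val = val * q + s  # base-q digits, most significant first
        out.append(val)
    return out

def unbundle(bundled, q, b):
    """Inverse of bundle: split each extension symbol into its b digits."""
    out = []
    for v in bundled:
        digits = []
        for _ in range(b):
            v, d = divmod(v, q)
            digits.append(d)
        out.extend(reversed(digits))
    return out

# round-trip: unbundle inverts bundle
assert unbundle(bundle([1, 0, 2, 1], q=3, b=2), q=3, b=2) == [1, 0, 2, 1]
```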
For some coding paradigms, e.g. convolutional codes, we should regard the entire data as an (infinite) stream, not as blocks.
Relatively complicated combinations of the above could well be called for in a given application.
For instance, to test decoding of s interleaved Reed-Solomon codes of length n over GF(q) under exactly t "column errors", we could regard this as
(GF(q)^n)^s inputs > Interleaving (outputs (GF(q)^s)^n) > Unblocking (outputs GF(q)^s) > Bundling (outputs GF(q^s)) > Blocking (outputs GF(q^s)^n) > Static error rate (adds GF(q^s) errors) > Unblock (outputs GF(q^s)) > Unbundle (outputs GF(q)^s) > Block (outputs (GF(q)^s)^n) > Deinterleave (outputs (GF(q)^n)^s)
One could add flexibility to make the above much more convenient to use. For instance, if StaticErrorRate could add errors directly to GF(q)^s elements, and not just GF(q^s), the tedious bundling/blocking and unblocking/unbundling would not be needed, and we would have
Interleaving > StaticErrorRate > Deinterleave
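Such a chain could be modelled by composing channels as plain functions; the stage names below are placeholders for the channels in the example, not an existing API:

```python
from functools import reduce

def chain(*channels):
    """Compose channels left to right: each stage's output feeds the
    next, mirroring a chain like
    Interleaving > StaticErrorRate > Deinterleave."""
    return lambda data: reduce(lambda x, ch: ch(x), channels, data)

# placeholder stages: interleaving s equal-length blocks is
# transposition, and the error stage here is an identity stand-in
# for StaticErrorRate
interleave = lambda blocks: [list(col) for col in zip(*blocks)]
deinterleave = interleave  # transposition is its own inverse
add_errors = lambda blocks: blocks

pipeline = chain(interleave, add_errors, deinterleave)
data = [[1, 2, 3], [4, 5, 6]]
```

With the identity error stage, the pipeline round-trips: interleaving followed by deinterleaving returns the original blocks.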
An elegant way to support everything with a minimal number of specialised classes is by ensuring composability: implement channels that work on the level that makes most sense to them, and implement higher-order channels for converting between representations.
A common use case, as in the interleaved RS example, is to convert into some representation, apply the errors there, and then convert back again. Such examples could be simplified considerably if channels can be composed in a surrounding manner. Something like the following would implement the interleaved RS example:
Interleave o StaticErrorRate
where o denotes surrounding channel composition.
This would result in one object from the user's point of view, which takes s codewords and returns s received words from the same ambient space, with exactly t "column" errors added.
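One possible reading of surrounding composition, sketched in plain Python: the outer channel converts into its representation, the inner channel acts there, and the outer channel converts back. The transmit/invert method names are assumptions for this sketch:

```python
class Surround:
    """Surrounding composition (outer o inner): convert into the outer
    channel's representation, apply the inner channel there, then
    convert back, so the user sees one channel on the original space."""

    def __init__(self, outer, inner):
        self.outer, self.inner = outer, inner

    def transmit(self, data):
        converted = self.outer.transmit(data)
        return self.outer.invert(self.inner.transmit(converted))

class Interleave:
    """Toy converter: interleaving s equal-length blocks is transposition."""
    def transmit(self, blocks):
        return [list(col) for col in zip(*blocks)]
    invert = transmit  # transposition is its own inverse

class CorruptFirstBlock:
    """Toy inner channel: corrupts every symbol of the first interleaved
    block, which amounts to one 'column error' across the s words."""
    def transmit(self, blocks):
        out = [list(b) for b in blocks]
        out[0] = [x + 1 for x in out[0]]
        return out

received = Surround(Interleave(), CorruptFirstBlock()).transmit([[1, 2], [3, 4]])
```

From the user's point of view this is a single object: s words in, s received words out, with the errors applied in the interleaved representation.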
Soft-channel example
Another example is transmitting GF(2^8) elements over an AWGN channel using BPSK modulation of the field elements while retaining reliability information, i.e. more precisely:
1. Split each field element into 8 bits.
2. Convert each bit into -1 or 1 as a real number.
3. Actual transmission: apply AWGN, so anything in the range [-1; 1] is received.
4. Make a hard decision back to -1 or 1, but calculate the probability of having made a wrong decision.
5. Convert -1, 1 into 0, 1.
6. Reassemble the field elements from 8 bits, but use the bit-level probabilities to calculate the probability of all other field elements as well.
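Steps 2-5 of the list above can be sketched in plain Python. The posterior error probability 1/(1 + exp(2|y|/sigma^2)) is the standard expression for BPSK over AWGN with noise standard deviation sigma; the function name is made up for this sketch:

```python
import math
import random

def bpsk_awgn_soft(bits, sigma):
    """Map bits {0,1} to reals {-1,+1} (BPSK), add Gaussian noise (AWGN),
    hard-decide back to a bit, and attach the posterior probability that
    the hard decision is wrong."""
    out = []
    for b in bits:
        x = 1.0 if b else -1.0           # step 2: modulate bit to -1/+1
        y = x + random.gauss(0, sigma)   # step 3: transmit over AWGN
        hard = 1 if y >= 0 else 0        # steps 4-5: hard decision, back to 0/1
        p_err = 1.0 / (1.0 + math.exp(2 * abs(y) / sigma ** 2))
        out.append((hard, p_err))
    return out

res = bpsk_awgn_soft([0, 1, 1, 0], sigma=0.5)
```

The probability of a wrong decision is at most 1/2 (attained when y = 0, where the hard decision is a coin flip).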
This could be achieved by
Unblock o UnbundleSoft o BPSK o AGWNReal
The partial channel chains would then have the following types (taken from one of the components and the remaining chain):
AGWNReal: Real > Real
BPSK: F2*Proba > F2*Proba
UnbundleSoft: (F(2^8) * Proba^8) > (F(2^8) * Proba^8)
Unblock: (F(2^8) * Proba^8)^n > (F(2^8) * Proba^8)^n
The reason for returning a Proba^8 is that the probability computations are quite heavy and would take up large amounts of time and memory, so they should be delayed until asked for.
To get the probabilities, the user calls a method on UnbundleSoft, which can take a set of the 8 reals as input. The computation is achieved by asking BPSK what that number means as a probability, which in turn asks AGWNReal what that number means.
A soft-decision decoder should therefore take (GF(2^8) * R^8)^n as its input space (as opposed to, in particular, GF(2^8)^n * R^8). The decoder also needs the UnbundleSoft channel as input in order to ask for the probabilities.
This way, one could change AGWNReal to a different soft-error channel, which adds soft errors with different probabilities, without changing the implementation of UnbundleSoft or BPSK.