[147486] in cryptography@c2.net mail archive


Re: [Cryptography] encoding formats should not be committee'ized

daemon@ATHENA.MIT.EDU (Stephan Neuhaus)
Thu Oct 3 12:40:53 2013

X-Original-To: cryptography@metzdowd.com
Date: Thu, 03 Oct 2013 18:12:33 +0200
From: Stephan Neuhaus <stephan.neuhaus@tik.ee.ethz.ch>
To: Peter Gutmann <pgut001@cs.auckland.ac.nz>
In-Reply-To: <E1VRdej-0005ax-OV@login01.fos.auckland.ac.nz>
Cc: leichter@lrw.com, cryptography@metzdowd.com
Errors-To: cryptography-bounces+crypto.discuss=bloom-picayune.mit.edu@metzdowd.com

On 2013-10-03 09:49, Peter Gutmann wrote:
> Jerry Leichter <leichter@lrw.com> writes:
> 
>> My favorite more recent example of the pitfalls is TL1, a language and
>> protocol used to manage high-end telecom equipment.  TL1 has a completely
>> rigorous syntax definition, but is supposed to be readable.
> 
> For those not familiar with TL1, "supposed to be readable" here means "encoded
> in ASCII rather than binary".  It's about as readable as EDIFACT and HL7.

Then that puts it in the same category as HBCI version 1.  Sure, it was
rigorous.  Sure, it was unambiguous.  Sure, it was ASCII-encoded.  But
human-readable?  I implemented that protocol once, and can assert that,
after reading more HBCI messages than was probably good for me, I felt
decidedly less than human.

Fun,

Stephan
_______________________________________________
The cryptography mailing list
cryptography@metzdowd.com
http://www.metzdowd.com/mailman/listinfo/cryptography
