From IRC :
mats: At the suggestion of an acquaintance, am re-reading ‘A Fire Upon the Deep’
* ascii_field liked it.
mats: Acquaintance noted that Vinge had an idea that predates ‘The Cult of Langsec’. That is: translation to an intermediate language with low expressive power, then back to whatever’s parsed by original recipient, as a means of avoiding pwnage by a powerful adversary.
ascii_field: The funny part is that this is actually how you ~guarantee~ ease of pwnage.i
mats: You think so?
ascii_field: Aha. x86. QED.ii The yarn had some imaginative and perhaps even good ideas, this was not one of these.
mats: Well, as you like to remind so often, x86 ain’t everything there is.iii
ascii_field: No, but the basic idea of turning a high-level message into incomprehensible ground beef (the actual result of translating from a more-expressive language to a less-expressive one) invariably subtracts from fits-in-head.
mats: Anyway, the langsec idea goes, input validation has a striking similarity to program verification, precluding inputs from driving unexpected state and computation.
ascii_field: Program verification is 1) provably unsolvable in the general case 2) to the extent it adds complexity and overall logical mass and subtracts from fit-in-head-ability, it is ~an evil~.
mats: Yes, and if you reject inputs outside a defined set of formalizable grammars, then the more constrained the model, the more approachable program verification becomes.
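A minimal sketch of the idea mats is gesturing at (the grammar and names are my own illustration, not anything from the langsec literature): declare a deliberately regular input language, run a recognizer over the raw input first, and only ever process inputs that are sentences of that language. Because the language is regular, the recognizer is a finite automaton: it always terminates, touches each byte once, and carries no hidden state.

```python
import re

# Illustrative toy grammar: key=value pairs like "user=mats;id=42".
# Regular by construction, so recognition is a single DFA pass.
MESSAGE = re.compile(r'[a-z]+=[a-z0-9]+(;[a-z]+=[a-z0-9]+)*')

def recognize(raw: str) -> bool:
    """Accept the input iff it is a sentence of the declared grammar."""
    return MESSAGE.fullmatch(raw) is not None

def handle(raw: str) -> dict:
    """Processing happens only after recognition; anything outside the
    grammar is rejected whole, never partially interpreted."""
    if not recognize(raw):
        raise ValueError("input is not in the declared language")
    return dict(pair.split('=') for pair in raw.split(';'))
```

The point of the constraint is that the set of reachable states in `handle` is now bounded by the grammar, which is what makes verification tractable at all.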
ascii_field: It is certainly both possible and useful in narrow cases. See the Ada threads.iv But as an overall panacea, it is a USGism.v
mats: How conspiratorial.
BingoBoingo: Conspiracy, conspiracy never ends.
ascii_field: I don’t know where you are posting from, but on my planet we recently saw astonishingly brazen clandestine techno-diddlatrons blown wide open.vii We are also bedeviled by pseudoscientific academitards who push them as ‘Security Research.’
BingoBoingo: Actual security research takes the form ‘X barrier can resist breach by thermal lance at Y intensity for Z minutes.’
mats: USG is a fount of grants for certain fields, computer security is one of those blessed.viii Consideration of all USG research as poisoned fruit is your prerogative, but for me, extraordinary and specific claims require extraordinarily specific evidence. Like someone diddling an IR parser.
ascii_field: It isn’t even that ‘Don’t use the work of Dr. X., it will kill you,’ but that overall DIRECTIONS of research that have been given support by USG in the past decade are specifically counterproductive. Calculatedly. E.g., the push for movement from RSA to ECC, accompanied by unsubstantiated claims of ‘equivalence’ between long RSA and short ECC key ‘hardnesses’. Find me ~someone, anyone~ funded by American dollars who puts forth the opposing view to this.
pete_dushenski: Obligatory : http://trilema.com/2014/how-to-deal-with-pseudoscience/
assbot: How to deal with pseudoscience ? on Trilema – A blog by Mircea Popescu.
ascii_field: The thing is brazen and one-sided enough to make U.S. ‘climatology’ look good.
pete_dushenski: Anyways mats, whatever they’re paying you, you’re definitely dancing hard enough for it. Or is that too conspiratorial of me ?ix
ascii_field: IIRC mats admitted to merely ~wishing~ to be bought.
pete_dushenski: ‘Careful what you wish for, you just might get it’ ™®x
mats: Yes, this is all well and good, but we were talking about IR parsers.xi
ascii_field: And I was talking about hygiene. Hygiene does pointedly ~not~ mean ‘We must inspect every roadkill for edible bits.’
mats: Or have we moved on to hand waving and accusations of being bought because honest discussion is too troublesome and insinuations about pwnage cannot be substantiated.xii
ascii_field: I pointedly do not care whether or by whom you were (or like to be) bought. My statement concerned heuristics for what intellectual pursuits are worth bothering with, given limited resource.
mats: Then I suppose I’ll conclude with disagreeing this is an unworthy area of research.xiii
ascii_field: Thing is, brain cycles worth half a damn are scarce. USG has a very effective program for soaking them up. On items GUARANTEED not to result in serious dings to ‘Nothing is beyond our reach.’ (E.g., from techno-lustration of silicon and upwards).
BingoBoingo: I’m pretty sure the push to get people into grad school rather than employment 2007-2011 was exactly this.
ascii_field: Nah that’s mainly panem et circenses.xiv
BingoBoingo: Well that and taking a portion of hungry people and redirecting their worry so they burn themselves down instead of burning the whole show.
ascii_field: From USG’s point of view, ANYTHING is better than having folks wake up to CPU-with-bounds-checking-on-all-ops and fits-in-head.
mats: Well, if you ever decide to describe why ‘langsec’ et al. is a waste of time,xv I’d love to read it. You too, pete_dushenski. Waste of time, like whether and how the advancement of formal languages for security is useful, or keeping input grammars regular whenever possible, verifiable parsers, …
ascii_field: Start here: http://www.loper-os.org/?p=1390
assbot: Loper OS » Of Decaying Urbits.
ascii_field: Note that I did not say ‘waste of time’. If pursued as pure mathematics, it can be a mildly respectable thing.
mats: I am generally interested in these things because I like knowing about whether a given abstraction kills particular techniques.
ascii_field: It is not an uninteresting subject. Just as, say, historical climate patterns are not uninteresting. But both would become considerably more intellectually respectable if all of the current academic practitioners of each were to be shot.
mats: Now we’re getting somewhere. Which ‘langsec’ authors have a funny smell? What papers smell funny?
punkman: “LANGSEC posits that the only path to trustworthy software that takes untrusted inputs is treating all valid or expected inputs as a formal language, and the respective input-handling routines as a recognizer for that language.” << So what is the formal language when I grab random bytes off someone’s HTTP ?
ascii_field: This is fundamentally an example of what I was talking about. Deceive people into failing to so much as suspect that the root of their problems can be dealt with, without telling any lies in the usual sense of the word. This is not ‘ordered from above’ in the naive way imagined by hecklers of ‘conspiratorial’ matters. Instead, it is baked into the ‘firmware’ of academia as a thing. No one who actually ~solves~ problems at the eliminate-a-whole-field level is remotely welcome. This is elementary.
mats: Well, just a thought here, functional programming looks like a better path forward than Ada, if only because it’s easier to automate.
ascii_field: Automate ?
mats: Automating formal analyses.
ascii_field: I ~like~ functional programming, understand. Purely aesthetically. But there is no such thing, in our universe, as a ‘functional’ CPU. And the data structures popular among ‘functionality’ aficionados are not physically possible, but instead are clunkily emulated (‘immutability’) with actual ones. For instance, I like ‘ML’ (the language), but I will not close my eyes to the fact that the garbage collector is a cross-process info leaker, not to mention a source of nondeterminism. My other problem is that I have not yet found an implementation of the ML language that fits-in-head. This is not negotiable. And yes, I guess, being concerned with the number of intellectual ‘CPU cycles’ needed to fully grasp the ~implementation~ of the language and the machine under it – makes me a t3rr0r1st11111.
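The ‘clunky emulation’ point deserves a concrete sketch (my own illustration in Python, not ML): a persistent list on mutable hardware never updates in place. Every ‘modification’ allocates a fresh copy of the spine up to the changed cell and shares the rest, and the superseded version becomes exactly the garbage the collector then has to chase.

```python
# A persistent (immutable) cons list, as functional languages present it.
# On real hardware there is no immutability: each "update" allocates
# fresh cells, and the old version becomes work for the garbage collector.
Nil = None

def cons(head, tail):
    """A list cell: an immutable (head, tail) pair."""
    return (head, tail)

def set_at(lst, i, value):
    """'Update' index i: copies the first i cells, shares the rest."""
    if lst is Nil:
        raise IndexError(i)
    head, tail = lst
    if i == 0:
        return cons(value, tail)                    # new cell, old tail shared
    return cons(head, set_at(tail, i - 1, value))   # copied spine

def to_pylist(lst):
    """Walk the cons cells into an ordinary Python list, for inspection."""
    out = []
    while lst is not Nil:
        head, lst = lst
        out.append(head)
    return out
```

After `set_at`, the original list is untouched and both versions coexist in memory, which is elegant to reason about and precisely the allocation churn ascii_field is complaining about.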
mats: I research what folks are doing to incrementally raise the cost of attack, because there is interesting work being done there, this is where the money is, and we are all living with various design decisions that can’t be undone at low cost. And obviously systems that can provably prevent things from happening are preferable…xvii
ascii_field: The only thing the offerings of USG ~provably~ are able to do is to lighten your wallet.
punkman: I’m all for langsec and chipsec and whatever other brand they come up with, but I don’t see anyone getting anywhere.
mats: EMET’s mitigations don’t look good.xviii And don’t obviously categorically defeat classes of bugs. Maybe this is just not the place for me to bring up such discussions, when folks clearly are not interested in this kind of research.
ascii_field: Try #bitcoin-wizards. (In all seriousness.) Buncha haskell nerds, etc. there.
mats: I read the logs sometimes.
ascii_field: Understand, I have no objection to tools such as computerized theorem-proving, data flow analysis, etc. except in that these are put forward as ~substitutes for fits-in-head simplicity~. There is not a substitute. Anyone who proposes one, directly or by implication, is (whether he knows it or not) committing pseudointellectual flimflammery in the service of Hitler.
So you see, even if the subject matter of this conversation was over your head, there’s an obvious and irreconcilable difference between those fighting off the Kraken in the boiling deep oceans on the one hand, and those swimming in a miniature fishbowl of their own feces, their eyes jammed shut with illogical crud, on the other.
Never the twain shall meet.
___ ___ ___
- Try this experiment sometime with human language. 1) Take a passage from Virgil’s Aeneid in the original Latin and run it through Google Translate into English, then 2) Run the translation back again into Latin. You think your powerful adversary’s going to be confused by the result or just think that you’re functionally retarded at best and at worst diddle a few bits and pwn your ass ? Hm. [↩]
- Quod erat demonstrandum, as has already been demonstrated. [↩]
- In logic, what mats does here is called a “hasty generalisation,” one of a raft of logical fallacies that he spews up in channel – and with disconcerting regularity, much to my personal annoyance and quite possibly to the pernicious persuasion of less diligent log readers.
The specious reasoning that mats employs and the seemingly endless scores of fallacies that he rests his defenses on really have no place in La Serenissima, for they lack either wisdom or logic :
La Serenissima’s anti-puritanism is developed from causes rather than towards explicit ideals, and it’s propelled by two incredibly powerful forces and an avowed belief in the righteousness therein. These forces being:
1) The prevailing power and wisdom of logic, and
2) The prevailing power and wisdom of Mircea Popescu.
- Eg. this one. [↩]
- In local #b-a parlance, a “USGism” is that which makes the world better for mentally impoverished idiots and worse for the elites. In essence, it’s everything we fight against. [↩]
- I had to laugh ! Mainly because this is the exact same shit I negrated him over. For those keeping track and interested in a little WoT history, mats is only the second person I’ve negrated in the past ~2 years. Usagi, for being similarly stubborn about his ignorance, was the first. [↩]
- There are too many examples to list on this score, so check your delusions, man. [↩]
- So is potato research. What of it ? [↩]
- Gotta troll the kid a bit, y’know ? [↩]
- A reference from this video. Yes, that one in the middle is what Lewis Hamilton plows on the regular. Notbad.jpg [↩]
- Nice red herring attempt here but it’s not gonna fly. He has a few tricks up his sleeve, this one. [↩]
- A little “appeal to pity” action here. [↩]
- You’ll note that Stan never once said anything about the worthiness or unworthiness of computer security research, merely that the current idiots pretending to do it are just that : pretenders. Sure they’re doing it to put bread on the table and hopefully retire with a pension, but that doesn’t mean that their days aren’t numbered.
In essence, mats commits yet another logical error here. This time : the straw man fallacy. [↩]
- Bread and circuses, which, if you’ve had any contact with the USG in the last few decades, you’ll recognise as lab-modified “food” and professional sports. [↩]
- Another straw man. [↩]
- C’mon guys, let’s 4U70M473 all the thingz ! It’ll be cool and never go wrong !! Promise !!!1 [↩]
- This is an example of begging the question, another logical fallacy. Is anyone at home keeping tally of the number of fallacies that mats is so stubbornly committing in this one conversation ? [↩]
- As far as I can tell, EMET is some kind of winbloze sekoority scam. Seriously, no one with half a functioning brain gives a shit what Microsoft does or says in the name of digital security. Bill Gates can suck a fat cock. [↩]