• 0 Posts
  • 27 Comments
Joined 1 month ago
Cake day: August 16th, 2024


  • Bro, he’ll never die. Chomsky will never die either. I’m serious: if Chomsky ‘stops living in his body’, his ideas in linguistics (not his political shit-flinging) will still be alive. When Knuth dies, his programs and his algorithms and his books (especially his books) will still be alive. This is true of every single scientist or scholar. I did not even know Dijkstra was dead! He died when I was 9! I only just found out, even though I love his work with C.A.R. Hoare (and Ole-Johan Dahl), especially “Structured Programming”. I just might write a simple guide on the Hoare triple (little example below)! So no, scientists never die. Ken Thompson won’t die either. Bro will live in the shell forever.
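
    For a taste: a Hoare triple {P} C {Q} reads “if precondition P holds before command C runs, then postcondition Q holds afterwards”. The classic assignment example:

    { x = N }   x := x + 1   { x = N + 1 }

    That one line is most of the idea; the rest of Hoare logic is rules for composing such triples across sequencing, conditionals, and loops.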


  • Wow, that TLD! Perl has a rich history. Wall is the ‘people’s computer scientist’. He’s a linguist by accreditation, but he’s earned plenty of compsci credit too. He’s the guy people like Alan Kay hate. Perl is a language that’s more ‘soul’ than it is ‘science’, but it excels at that. I love Perl, and Perl-derived languages like Ruby and Python, which try to Kay-ize Perl by pulling it closer to Smalltalk, and I think Python might have succeeded, given how popular it is! So in one camp we have ultra-sciency languages like Haskell. Simon Peyton Jones even has a tutorial on writing a functional language in Haskell (well, in Miranda, but you get it!). But there’s no tutorial on how to write Perl: you have to have soul, and 40 years of spare time, to re-create Perl. That’s why there are hardly any re-implementations. Perl is just Perl, one and only. Perl 6 got away from that, tried to ‘specify’. But if I wanted specs, I’d go to Scheme! When I want soul, I come to Perl.


  • I learned most of this stuff (stuff as oblique as Girard’s System F, which most modern polymorphic type systems derive from) on my own. I have the ‘Handbook of Satisfiability’ set aside so I can learn how SAT/SMT solvers work; I need them for my electronic circuit simulator (there’s a tiny backtracking sketch at the end of this comment). Keep in mind that there are other approaches to these NP-hard problems too, for example BDDs (Binary Decision Diagrams). BDD-based solvers are much easier if you wanna give them a try!

    People who say “you don’t need college” are seriously missing out on key stuff that would help them understand their day-to-day tasks. In my self-studies I learned about some neat, oblique things: dataflow languages and Lucid, Church-Rosser confluence, the Curry-Howard isomorphism, and so much more!

    I’m going back to college because college is cheap here, but I’m a 3rdie; even the cheapest college in the US or Europe is much better than “Payam Nur University”, is it not? (That’s the college I’m going to; it’s state-funded studies for prisoners!) Also, Americans will have a much easier time getting into a European university. You guys could try that. College is almost free for me, but you have access to much better technological foundations in Silicon Valley. Here, you have to include your own batteries. For example, to fund my college (not the tuition, but I have to live, right? And my uncle is having a hard time supporting a dozen people; sanctions are getting rough) I’m going to open up a ‘keyboard workshop’: I’ll build mechanical keyboards and ship them to people.

    So if you don’t wish to, or can’t, study in college, there’s always self-study. This book began my journey, just like many others’: Introduction to the Theory of Computation, 3rd edition -- Michael Sipser

    You also need to read papers. So here’s your first homework: find this book on Google Scholar. You’ll be needing it _a lot_!

    It’s sort of a ‘meme’. Dip in. Self-learners are not ‘second best’ to college students! Especially in some shithole like mine. Think of it this way: your computer is like a calculator. Use it to learn computation!

    Thanks.
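
    P.S. Since I mentioned SAT solvers: here’s a minimal sketch of the backtracking search at the core of DPLL-style solvers, in JS since that’s the language elsewhere in this thread. It’s my own toy illustration (no unit propagation, no clause learning, none of what makes real solvers fast). A formula is an array of clauses; a clause is an array of signed integers.

    // Toy DPLL-style backtracking SAT search. Positive n = variable n,
    // negative n = its negation. Returns an assignment object or null.
    function solve(clauses, assignment = {}) {
      const simplified = [];
      for (const clause of clauses) {
        let satisfied = false;
        const remaining = [];
        for (const lit of clause) {
          const v = Math.abs(lit);
          if (v in assignment) {
            // Literal is true under the assignment: clause is satisfied.
            if (assignment[v] === (lit > 0)) { satisfied = true; break; }
            // Literal is false: just drop it from the clause.
          } else {
            remaining.push(lit);
          }
        }
        if (satisfied) continue;
        if (remaining.length === 0) return null; // empty clause: conflict
        simplified.push(remaining);
      }
      if (simplified.length === 0) return assignment; // every clause satisfied

      // Branch on the first unassigned variable, trying true then false.
      const v = Math.abs(simplified[0][0]);
      for (const value of [true, false]) {
        const result = solve(simplified, { ...assignment, [v]: value });
        if (result !== null) return result;
      }
      return null; // both branches failed: unsatisfiable
    }

    // (x1 OR NOT x2) AND (x2 OR x3) AND (NOT x1 OR NOT x3)
    console.log(solve([[1, -2], [2, 3], [-1, -3]]));
    // => { '1': true, '2': true, '3': false }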


  • Oh, I’m just a poseur! As I understand it, Usenet was for ‘grown-up people’ (like Larry Wall; check the interview with him I just posted!) and BBSes were for ‘hacker scene kids’. I started using the internet in 2005 when I was 12 (I did use it sparingly before that, though), and back in 2005 the ‘web-ization of the internet’ was not as pronounced as it is now, but it was strong enough that I had trouble understanding where the internet ends and the web begins! But well, it’s in the name, is it not? :D



  • Just remember that back when Knuth wrote this, there was no such thing as ‘scripting’. So if you don’t necessarily ‘program’ but ‘script’ a lot, it’s the same idea. With scripting, the cleverness is not in the algorithms you use or stuff like that; it’s, as you said, in the clever use of resources. I have a story to tell:

    A few hours ago my brother showed me this guy on Twitter who had asked people to ‘partition an array of numbers and nulls on the nulls’ (in JS). He showed his original solution, which was iterative and very non-functional in style, and I kinda don’t like code that is just “too” imperative, you know? Then my brother showed me someone else’s solution:

    const arr = [21, 242, 1135, null, 1341, null, 2424, 11, 22, 444];
    // solution
    arr.join(',').split(',,').map(subarray => subarray.split(','))
    // => [['21', '242', '1135'], ['1341'], ['2424', '11', '22', '444']]
    // (note: the numbers come back out as strings)


    Golfing like this is exactly what would make Knuth cry! I wish people understood that golfing is not very readable! But understanding why this works is what makes you more endearing, to me at least! It works because ECMA-262 specifies that Array.prototype.join renders null (and undefined) elements as the empty string. null isn’t an object with a prototype behind it; it’s a primitive baked into the language, so it has nothing to ‘respond’ to join’s stringification with, and the spec just papers over it with ''. JavaScript descends (by way of Self) from Smalltalk-80’s world where everything is an object that communicates by messages; null is the one thing that does not communicate, so it gets left out.
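
    You can see the spec rule in isolation in any JS console:

    // Array.prototype.join renders null and undefined as empty strings:
    [1, null, 2, undefined, 3].join(',')  // => '1,,2,,3'
    // ...which is why ',,' marks exactly the null positions above.
    // It's a special case in join, not general string conversion:
    String(null)  // => 'null'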

    Now, knowing that, doesn’t this code look more beautiful?
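
    That said, if I were writing it for readability rather than golf, a single reduce pass does the partition and keeps the numbers as numbers (my own sketch, not anything from that Twitter thread):

    // Split an array into groups, starting a new group at each null.
    function partitionOnNull(xs) {
      return xs.reduce((groups, x) => {
        if (x === null) groups.push([]);          // null opens a new group
        else groups[groups.length - 1].push(x);   // otherwise extend the last
        return groups;
      }, [[]]);
    }

    partitionOnNull([21, 242, 1135, null, 1341, null, 2424, 11, 22, 444]);
    // => [ [21, 242, 1135], [1341], [2424, 11, 22, 444] ]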