• 5 Posts
  • 667 Comments
Joined 1 year ago
Cake day: June 20th, 2023



  • Can anyone explain why Wayland exists or who cares about it? X has been around forever, it sucks but it works and everything supports it. Alternatives like NeWS came around that were radically better, but were too soon or relied too much on corporate support, so they faded. The GNU project originally intended to write its own thing, but settled for using X. Now there’s Wayland though, which seems like a slight improvement over X, but mostly kind of a lateral move.

    If you’re going to replace X, why not do something a lot better? If not actual NeWS, then something that incorporates some of its ideas. I think Squeak was like that but I don’t know much about it.






  • Forth is fun but not really suitable for large, long-lasting projects with huge developer communities. Linux isn’t being bootstrapped, it’s already here and has been around for decades and it’s huge. And I think bootstrapping-by-poking-around on a new architecture has stopped being important. Today, you have compilers and OSes targeting the new architecture under simulation long before any hardware exists, with excellent debugging tools available in the simulator.


  • I don’t think Ada in the kernel would get any cultural acceptance. Rust has been hard enough. C++ was vehemently rejected decades ago though the reasons made some sense at the time. Adopting C++ today would be pretty crazy. I don’t see much alternative to Rust (or in a different world, Ada) in the monolithic kernel. But Rust seems like it’s still in beta test, and the kernel architecture itself seems like a legacy beast. Do you know of anything else? I can’t take D or Eiffel or anything like that seriously. And part of it is the crappiness of the hardware companies. Maybe it will have to be left to future generations.


  • I have played with Ada but not done anything “real” with it. I think I’d be ok with using it. It seems better than C in most regards. I haven’t really looked into Rust but from what I can gather, its main innovation is the borrow checker, and Ada might get something like that too (influenced by Rust).

    I don’t understand why Linux is so huge and complicated anyway. At least on servers, most Linux kernels are running under hypervisors that abstract away the hardware. So what else is going on in there? Linux is at least 10x as much code as BSD kernels from back in the day (idk about now). It might be feasible to write a usable Posix kernel as a hypervisor guest in a garbage collected language. But, I haven’t looked into this very much.

    Here’s an ok overview of Ada: http://cowlark.com/2014-04-27-ada/index.html





  • solrize@lemmy.world to Programming@programming.dev · Safe C++

    I’ll look at the wiki article again but I can pretty much promise that Ada doesn’t have dependent types. They are very much a bleeding edge language feature (Haskell will get them soon, so I will try using them then) and Ada is quite an old fashioned language, derived from Pascal. SPARK is basically an extra-safe subset of Ada with various features disabled, that is also designed to work with some verification tools to prove properties of programs. My understanding is that the proof methods don’t involve dependent types, but maybe in some sense they do.

    Dependent types effectively require the type system itself to be a full programming language, so you can have a type like “prime number” and prove number-theoretic properties of functions that operate on such values. Apparently that kind of type-level computation is unintentionally possible with C++ template metaprogramming, which is why C++ is listed in the article, but actually trying to use C++ that way is totally insane and impractical.
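For flavour, here is a loose sketch in Rust (not C++, and not dependent types proper) of that kind of compile-time checking: const evaluation can reject a non-prime constant before the program ever runs. The names here are invented for illustration.

```rust
// Compile-time primality check via const evaluation. There is no
// "prime number" type here, so this is much weaker than dependent
// types, but the check does happen entirely at compile time.
const fn is_prime(n: u64) -> bool {
    if n < 2 {
        return false;
    }
    let mut d = 2;
    while d * d <= n {
        if n % d == 0 {
            return false;
        }
        d += 1;
    }
    true
}

// Referencing PrimeChecked::<N>::OK forces the assert to be
// evaluated at compile time; a non-prime N is a compile error.
struct PrimeChecked<const N: u64>;

impl<const N: u64> PrimeChecked<N> {
    const OK: () = assert!(is_prime(N), "N must be prime");
}

fn main() {
    let _ = PrimeChecked::<7>::OK; // compiles
    // let _ = PrimeChecked::<8>::OK; // would fail to compile
}
```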

    I remember looking at the wiki article on dependent types a few years ago and finding it pretty bad. I’ve been wanting to read “The Little Typer” (thelittletyper.com) which is supposed to be a good intro. I’ve also played with Agda a little bit, but not used it for real.


  • solrize@lemmy.world to Programming@programming.dev · Safe C++

    Dependent types only make sense in the context of static typing, i.e. compile time. In a dependently typed language, if you have a term with type {1,2,3,4,5,6,7} and the program typechecks at compile time, you are guaranteed that there is no execution path through which that term takes on a value outside that set. You may need to supply a complicated proof to help the compiler.
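Rust has nothing like dependent types, but for this particular finite-set example the closest everyday analogue is just an enum — a sketch with invented names, far weaker than the general mechanism since the set has to be enumerated by hand:

```rust
// A value of type `Day` can only ever be one of seven variants,
// enforced at compile time: no execution path can produce an
// out-of-range Day, which is the flavour of the static guarantee.
#[derive(Debug, Clone, Copy, PartialEq)]
enum Day {
    Mon, Tue, Wed, Thu, Fri, Sat, Sun,
}

fn number_of(d: Day) -> u8 {
    // Exhaustive match: the compiler rejects the program if any
    // variant is unhandled.
    match d {
        Day::Mon => 1,
        Day::Tue => 2,
        Day::Wed => 3,
        Day::Thu => 4,
        Day::Fri => 5,
        Day::Sat => 6,
        Day::Sun => 7,
    }
}

fn main() {
    assert_eq!(number_of(Day::Sun), 7);
}
```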

    In Ada you can define an integer type of range 1 .. 7 and it is no big deal. There is no static guarantee like dependent types would give you. Instead, the runtime raises an exception (Constraint_Error) if an out-of-range number gets sent there. It’s simply a matter of the compiler generating extra code to do these checks.
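The runtime-checked version translates to Rust as a hand-rolled newtype whose constructor validates the range (type and method names are mine, for illustration) — the checks Ada’s compiler emits automatically have to be written out explicitly:

```rust
// A sketch of Ada's runtime-checked range type in Rust: an
// out-of-range value panics, analogous to Ada raising an exception.
#[derive(Debug, Clone, Copy)]
struct Range1To7(u8);

impl Range1To7 {
    fn new(n: u8) -> Range1To7 {
        // The "extra code the compiler generates" in Ada, by hand.
        assert!((1..=7).contains(&n), "value out of range 1 .. 7");
        Range1To7(n)
    }
    fn get(self) -> u8 {
        self.0
    }
}

fn main() {
    let d = Range1To7::new(5);
    assert_eq!(d.get(), 5);
    // Range1To7::new(9) would panic at runtime, like Ada's check.
}
```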

    There is a separate Ada-related tool called SPARK that can let you statically guarantee that the value stays in range. The verification method doesn’t involve dependent types and you’d use the tool somewhat differently, but the end result is similar.




  • In Ada? No dependent types, you just declare how to handle overflow, like declaring int16 vs int32 or similar. Dependent types means something entirely different and they are checked at compile time. SPARK uses something more like Hoare logic. Regular Ada uses runtime checks.



  • In Ada, the overflow behaviour is determined by the type signature. You can also sometimes use SPARK to statically guarantee the absence of overflow in a program. In Rust, as I understand it, you can control the overflow behaviour of a particular arithmetic operation by wrapping a function or macro call around it, but that is ugly and too easy to omit.
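Concretely, the per-operation control Rust offers looks like this (a sketch; what plain `+` does also depends on whether overflow checks are compiled in):

```rust
fn main() {
    let x: u8 = 250;

    // Explicit per-operation overflow handling, chosen at each
    // call site rather than in the type declaration:
    assert_eq!(x.wrapping_add(10), 4);     // modular: 260 mod 256
    assert_eq!(x.checked_add(10), None);   // overflow reported as None
    assert_eq!(x.saturating_add(10), 255); // clamp at the maximum

    // Plain `x + 10` panics in debug builds but wraps in release
    // builds (unless overflow-checks is enabled) -- the "too easy
    // to omit" problem described above.
}
```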

    For ordinary integers, an arithmetic overflow is similar to an OOB array reference and should be trapped, though you might sometimes choose to disable the trap for better performance, similar to how you might disable an array subscript OOB check. Wraparound for ordinary integers is simply incorrect. You might want it for modular arithmetic and that is fine, but in Ada you get that by specifying it in the type declaration. Also in Ada, you can specify the min and max bounds, or the modulus in the case of modular arithmetic. For example, you could have a “day of week as integer” ranging from 1 to 7, that traps on overflow.

    GNAT imho made an error of judgment by disabling the overflow check by default, but at least you can turn it back on.

    The RISC-V architecture designers made a harder-to-fix error by making everything wrap around, with no flags or traps to catch unintentional overflow, so you have to generate extra code for every arithmetic op.