Great one liner on PL design:
Every programming language is designed to restrict trouble programmers can get into. The key is that they all have different ideas of trouble.
via Hacker News
So, for those not familiar, I go by toast` on the programming language IRC channels. When I saw this video come up on YouTube, with Mr. Engineer (my preferred TF2 class), I had a feeling of a certain kind of awesome. Like it was meant for me, or something.
Yeah, Toast!!
I was once asked “why bother learning these other languages, if you can’t use them”. It hadn’t occurred to this person that I code off hours. I’ve also been asked “what if you find these other languages actually are better than C++? Won’t you find it depressing having to code in a normal language then?”
It’s a good question. I don’t know if there’s one answer, but I’ve found mine.
Learning Lisp, Haskell, Python, and others has given me a greater appreciation for computer science, for the theory and universe of knowledge that lies underneath the surface of programming. Now when I code, I don’t just solve the problem at hand. I see different ways to solve it. I see other remote problems it’s related to. I appreciate the aesthetic of it all.
It’s not for everyone, but I find the mathematics underlying it all very beautiful.
And as for my day job in old-school C++? Well, I hack together a prototype in Python in a couple of hours, then spend a week translating it to C++ (adding verbosity, manual error handling, unfolding metaprogramming, etc.). My boss doesn’t necessarily understand the whole languages thing, but he’s happy when I get a two-week job done in one. I’m happy too; having gotten through the icky bits of the problem in Python, I spend less time working on the boring bits in C++, and can move on to the next project sooner.
There is one last question that I’m not sure I really have a good answer to: “Won’t you be tempted to use features that don’t exist in a normal language?” I’m hoping this will evaporate. Templates have existed in C++ since ’88 (pdf), various people have hacked lambdas in, and VC10 has added them for real. Short of those facilities, it’s not like I have forgotten how to write imperative code. It’s still an option; I just now realize the development costs of that option.
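For concreteness, here’s the sort of thing I mean, a throwaway sketch with made-up names: the hand-rolled functor we’ve all written, next to the VC10-style lambda that replaces it.

#include <algorithm>
#include <iostream>
#include <vector>

// The old workaround: a whole named struct just to add a constant.
struct AddN {
    int n;
    explicit AddN(int n_) : n(n_) {}
    int operator()(int x) const { return x + n; }
};

int main() {
    std::vector<int> v;
    for (int i = 1; i <= 4; ++i) v.push_back(i);

    // Functor version: the logic lives far away from the call site.
    std::transform(v.begin(), v.end(), v.begin(), AddN(10));

    // Lambda version: same effect, written inline where it's used.
    std::transform(v.begin(), v.end(), v.begin(),
                   [](int x) { return x + 10; });

    for (std::vector<int>::iterator it = v.begin(); it != v.end(); ++it)
        std::cout << *it << ' ';
    std::cout << '\n';
}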
~
I’m curious: what answers do you have for these questions?
(Update, added Casting SPELs)
For better or worse, it’s accepted in our industry to be fluent in only a single programming language or technology. It’s lame, but there are plenty of excuses for this. “Turing equivalence,” right? Finding quality tools is hard. And worse yet, to learn a new language you have to spend days and weeks reading some lame introductory text, right? Wrong!
There are tons of good intro guides out there that can get you up and running quickly, if only you know where to look. I’ve accumulated a bunch of these, and would like to spread the good wealth:
Found this old C++ source file in my scratch directory, apparently from last September. I don’t even remember writing this, but it’s written using my idioms. If I did write it, rest assured it was primarily for amusement purposes (abusement purposes?) only.
#include <iostream>
using namespace std;
int fibonacci(int x) {
    int a=0, b=1, temp;
    while(x?(--x,temp=a,a=b,b=temp+b):0);
    return a;
}
int main() {
    int i = 0;
    while(i<10&&(cout<<"fib("<<i<<")="<<fibonacci(i)<<'\n',++i));
}
Yeah. That was terrible. Terrible awesome. But looking back, I’m realizing that some of the comma abuse is superfluous:
int fibonacci(int x){
int a=0, b=1;
while(x?(--x,a=((b=a+b)-a)):0);
return a;
}
Don’t let anyone check in code like this, ever.
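For contrast, the boring version a reviewer would actually let through looks something like this (just a sketch; same math, no comma tricks):

// The version you'd actually check in.
int fibonacci(int x) {
    int a = 0, b = 1;
    for (int i = 0; i < x; ++i) {
        int next = a + b;  // advance the pair (a, b) -> (b, a + b)
        a = b;
        b = next;
    }
    return a;
}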
… And for everyone else: Typography for Lawyers. An easy read, very accurate, and despite the title, the typography advice is really for everybody.
Well, maybe not for typographers. I get the impression this is all entry level stuff.
Quick call out to an illustrative blog entry on various cache effects.
http://igoro.com/archive/gallery-of-processor-cache-effects/
When someone bugs me about “X is too slow because it has to make a virtual call,” and I get annoyed, it’s because a hot virtual call is an overhead of some dozen cycles or so. Missing the cache? In the thousands. Don’t get me wrong, virtual calls can matter for many reasons, but that all flies out the window the moment you’re working on any non-trivially sized data set. If your objects are hundreds of bytes large, you don’t worry about the virtual calls; you worry about shuffling their members around to squeeze more out of your caches.
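To illustrate what I mean by shuffling members around (made-up struct, hypothetical field names, nothing from real code): split the hot fields from the cold ones, so the per-frame loop streams through dense data instead of dragging hundreds of cold bytes through the cache for every element.

#include <cstddef>
#include <vector>

// Hypothetical "big" object: a few hot fields buried in hundreds of cold bytes.
struct Particle {
    float x, y, z;                // hot: read every frame
    bool  alive;                  // hot
    char  debug_name[64];         // cold: rarely touched
    char  editor_metadata[160];   // cold
};

// Hot/cold split: keep the cold data in a parallel array the scan never touches.
struct ParticleHot  { float x, y, z; bool alive; };
struct ParticleCold { char debug_name[64]; char editor_metadata[160]; };

float sum_live_x(const std::vector<ParticleHot>& hot) {
    float total = 0.0f;
    for (std::size_t i = 0; i < hot.size(); ++i)
        if (hot[i].alive) total += hot[i].x;   // far fewer cache lines per element
    return total;
}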
“It better not allocate”
My other favorite perf quote from this month: “It better not allocate — this call needs to take 100 microseconds or less”. On my dev box, on the default Win7 heap, an uncontended small allocation (and the paired free) is 120 ns, or 0.12 microseconds. My personal favorite small-object allocator can get down to 0.020 microseconds sustained.
We could allocate thousands of objects per call and still come in under budget.
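And if you don’t believe those numbers, it only takes a few lines to measure on your own box. A crude sketch, assuming a C++11 compiler for <chrono>; the real figures depend on your heap, allocation size, and contention:

#include <chrono>
#include <cstdio>
#include <vector>

int main() {
    const int kIters = 1000000;
    std::vector<void*> slots(kIters);

    std::chrono::steady_clock::time_point start = std::chrono::steady_clock::now();
    for (int i = 0; i < kIters; ++i) slots[i] = ::operator new(64);   // small, uncontended
    for (int i = 0; i < kIters; ++i) ::operator delete(slots[i]);
    std::chrono::steady_clock::time_point stop = std::chrono::steady_clock::now();

    long long total_ns =
        std::chrono::duration_cast<std::chrono::nanoseconds>(stop - start).count();
    std::printf("average alloc+free pair: %.1f ns\n", (double)total_ns / kIters);
}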
Just a humorous software bug story, the case of the 500-mile email:
"We're having a problem sending email out of the department." "What's the problem?" I asked. "We can't send mail more than 500 miles," the chairman explained. I choked on my latte. "Come again?"
A must read for CS/IT folks.
It’s bad style, but I must start with an aside: on reddit/scheme, there was a link to a blog series on developing a Scheme interpreter over January 2010. It might not implement any particular Scheme standard or particularly many libraries, but it’s got all the functional elements. Bootstrapping a programming language is fun and easy.
Anyway, he also posted his personal history of programming language study, and it got me thinking about my own history with programming languages.
It all started with Logo…